Dec 02 22:53:02 localhost kernel: Linux version 5.14.0-645.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025
Dec 02 22:53:02 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 02 22:53:02 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 02 22:53:02 localhost kernel: BIOS-provided physical RAM map:
Dec 02 22:53:02 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 02 22:53:02 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 02 22:53:02 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 02 22:53:02 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 02 22:53:02 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 02 22:53:02 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 02 22:53:02 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 02 22:53:02 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec 02 22:53:02 localhost kernel: NX (Execute Disable) protection: active
Dec 02 22:53:02 localhost kernel: APIC: Static calls initialized
Dec 02 22:53:02 localhost kernel: SMBIOS 2.8 present.
Dec 02 22:53:02 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 02 22:53:02 localhost kernel: Hypervisor detected: KVM
Dec 02 22:53:02 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 02 22:53:02 localhost kernel: kvm-clock: using sched offset of 65437968844 cycles
Dec 02 22:53:02 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 02 22:53:02 localhost kernel: tsc: Detected 2799.998 MHz processor
Dec 02 22:53:02 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 02 22:53:02 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 02 22:53:02 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec 02 22:53:02 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 02 22:53:02 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec 02 22:53:02 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 02 22:53:02 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 02 22:53:02 localhost kernel: Using GB pages for direct mapping
Dec 02 22:53:02 localhost kernel: RAMDISK: [mem 0x2d472000-0x32a30fff]
Dec 02 22:53:02 localhost kernel: ACPI: Early table checksum verification disabled
Dec 02 22:53:02 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 02 22:53:02 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 02 22:53:02 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 02 22:53:02 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 02 22:53:02 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 02 22:53:02 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 02 22:53:02 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 02 22:53:02 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 02 22:53:02 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 02 22:53:02 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 02 22:53:02 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 02 22:53:02 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 02 22:53:02 localhost kernel: No NUMA configuration found
Dec 02 22:53:02 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec 02 22:53:02 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Dec 02 22:53:02 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec 02 22:53:02 localhost kernel: Zone ranges:
Dec 02 22:53:02 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 02 22:53:02 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 02 22:53:02 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec 02 22:53:02 localhost kernel:   Device   empty
Dec 02 22:53:02 localhost kernel: Movable zone start for each node
Dec 02 22:53:02 localhost kernel: Early memory node ranges
Dec 02 22:53:02 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 02 22:53:02 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 02 22:53:02 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec 02 22:53:02 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec 02 22:53:02 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 02 22:53:02 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 02 22:53:02 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 02 22:53:02 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Dec 02 22:53:02 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 02 22:53:02 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 02 22:53:02 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 02 22:53:02 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 02 22:53:02 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 02 22:53:02 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 02 22:53:02 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 02 22:53:02 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 02 22:53:02 localhost kernel: TSC deadline timer available
Dec 02 22:53:02 localhost kernel: CPU topo: Max. logical packages:   8
Dec 02 22:53:02 localhost kernel: CPU topo: Max. logical dies:       8
Dec 02 22:53:02 localhost kernel: CPU topo: Max. dies per package:   1
Dec 02 22:53:02 localhost kernel: CPU topo: Max. threads per core:   1
Dec 02 22:53:02 localhost kernel: CPU topo: Num. cores per package:     1
Dec 02 22:53:02 localhost kernel: CPU topo: Num. threads per package:   1
Dec 02 22:53:02 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec 02 22:53:02 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 02 22:53:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 02 22:53:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 02 22:53:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 02 22:53:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 02 22:53:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 02 22:53:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 02 22:53:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 02 22:53:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 02 22:53:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 02 22:53:02 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 02 22:53:02 localhost kernel: Booting paravirtualized kernel on KVM
Dec 02 22:53:02 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 02 22:53:02 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 02 22:53:02 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec 02 22:53:02 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Dec 02 22:53:02 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Dec 02 22:53:02 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 02 22:53:02 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 02 22:53:02 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64", will be passed to user space.
Dec 02 22:53:02 localhost kernel: random: crng init done
Dec 02 22:53:02 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 02 22:53:02 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 02 22:53:02 localhost kernel: Fallback order for Node 0: 0 
Dec 02 22:53:02 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec 02 22:53:02 localhost kernel: Policy zone: Normal
Dec 02 22:53:02 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 02 22:53:02 localhost kernel: software IO TLB: area num 8.
Dec 02 22:53:02 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 02 22:53:02 localhost kernel: ftrace: allocating 49335 entries in 193 pages
Dec 02 22:53:02 localhost kernel: ftrace: allocated 193 pages with 3 groups
Dec 02 22:53:02 localhost kernel: Dynamic Preempt: voluntary
Dec 02 22:53:02 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 02 22:53:02 localhost kernel: rcu:         RCU event tracing is enabled.
Dec 02 22:53:02 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 02 22:53:02 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Dec 02 22:53:02 localhost kernel:         Rude variant of Tasks RCU enabled.
Dec 02 22:53:02 localhost kernel:         Tracing variant of Tasks RCU enabled.
Dec 02 22:53:02 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 02 22:53:02 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 02 22:53:02 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 02 22:53:02 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 02 22:53:02 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 02 22:53:02 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 02 22:53:02 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 02 22:53:02 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 02 22:53:02 localhost kernel: Console: colour VGA+ 80x25
Dec 02 22:53:02 localhost kernel: printk: console [ttyS0] enabled
Dec 02 22:53:02 localhost kernel: ACPI: Core revision 20230331
Dec 02 22:53:02 localhost kernel: APIC: Switch to symmetric I/O mode setup
Dec 02 22:53:02 localhost kernel: x2apic enabled
Dec 02 22:53:02 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Dec 02 22:53:02 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 02 22:53:02 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Dec 02 22:53:02 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 02 22:53:02 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 02 22:53:02 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 02 22:53:02 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 02 22:53:02 localhost kernel: Spectre V2 : Mitigation: Retpolines
Dec 02 22:53:02 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 02 22:53:02 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 02 22:53:02 localhost kernel: RETBleed: Mitigation: untrained return thunk
Dec 02 22:53:02 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 02 22:53:02 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 02 22:53:02 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec 02 22:53:02 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec 02 22:53:02 localhost kernel: x86/bugs: return thunk changed
Dec 02 22:53:02 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec 02 22:53:02 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 02 22:53:02 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 02 22:53:02 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 02 22:53:02 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec 02 22:53:02 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec 02 22:53:02 localhost kernel: Freeing SMP alternatives memory: 40K
Dec 02 22:53:02 localhost kernel: pid_max: default: 32768 minimum: 301
Dec 02 22:53:02 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec 02 22:53:02 localhost kernel: landlock: Up and running.
Dec 02 22:53:02 localhost kernel: Yama: becoming mindful.
Dec 02 22:53:02 localhost kernel: SELinux:  Initializing.
Dec 02 22:53:02 localhost kernel: LSM support for eBPF active
Dec 02 22:53:02 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 02 22:53:02 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 02 22:53:02 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 02 22:53:02 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 02 22:53:02 localhost kernel: ... version:                0
Dec 02 22:53:02 localhost kernel: ... bit width:              48
Dec 02 22:53:02 localhost kernel: ... generic registers:      6
Dec 02 22:53:02 localhost kernel: ... value mask:             0000ffffffffffff
Dec 02 22:53:02 localhost kernel: ... max period:             00007fffffffffff
Dec 02 22:53:02 localhost kernel: ... fixed-purpose events:   0
Dec 02 22:53:02 localhost kernel: ... event mask:             000000000000003f
Dec 02 22:53:02 localhost kernel: signal: max sigframe size: 1776
Dec 02 22:53:02 localhost kernel: rcu: Hierarchical SRCU implementation.
Dec 02 22:53:02 localhost kernel: rcu:         Max phase no-delay instances is 400.
Dec 02 22:53:02 localhost kernel: smp: Bringing up secondary CPUs ...
Dec 02 22:53:02 localhost kernel: smpboot: x86: Booting SMP configuration:
Dec 02 22:53:02 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec 02 22:53:02 localhost kernel: smp: Brought up 1 node, 8 CPUs
Dec 02 22:53:02 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Dec 02 22:53:02 localhost kernel: node 0 deferred pages initialised in 10ms
Dec 02 22:53:02 localhost kernel: Memory: 7763716K/8388068K available (16384K kernel code, 5795K rwdata, 13908K rodata, 4196K init, 7156K bss, 618212K reserved, 0K cma-reserved)
Dec 02 22:53:02 localhost kernel: devtmpfs: initialized
Dec 02 22:53:02 localhost kernel: x86/mm: Memory block size: 128MB
Dec 02 22:53:02 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 02 22:53:02 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Dec 02 22:53:02 localhost kernel: pinctrl core: initialized pinctrl subsystem
Dec 02 22:53:02 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 02 22:53:02 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec 02 22:53:02 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 02 22:53:02 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 02 22:53:02 localhost kernel: audit: initializing netlink subsys (disabled)
Dec 02 22:53:02 localhost kernel: audit: type=2000 audit(1764715980.005:1): state=initialized audit_enabled=0 res=1
Dec 02 22:53:02 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 02 22:53:02 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 02 22:53:02 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 02 22:53:02 localhost kernel: cpuidle: using governor menu
Dec 02 22:53:02 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 02 22:53:02 localhost kernel: PCI: Using configuration type 1 for base access
Dec 02 22:53:02 localhost kernel: PCI: Using configuration type 1 for extended access
Dec 02 22:53:02 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 02 22:53:02 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 02 22:53:02 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 02 22:53:02 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 02 22:53:02 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 02 22:53:02 localhost kernel: Demotion targets for Node 0: null
Dec 02 22:53:02 localhost kernel: cryptd: max_cpu_qlen set to 1000
Dec 02 22:53:02 localhost kernel: ACPI: Added _OSI(Module Device)
Dec 02 22:53:02 localhost kernel: ACPI: Added _OSI(Processor Device)
Dec 02 22:53:02 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 02 22:53:02 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 02 22:53:02 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 02 22:53:02 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec 02 22:53:02 localhost kernel: ACPI: Interpreter enabled
Dec 02 22:53:02 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 02 22:53:02 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Dec 02 22:53:02 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 02 22:53:02 localhost kernel: PCI: Using E820 reservations for host bridge windows
Dec 02 22:53:02 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 02 22:53:02 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 02 22:53:02 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [3] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [4] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [5] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [6] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [7] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [8] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [9] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [10] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [11] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [12] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [13] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [14] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [15] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [16] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [17] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [18] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [19] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [20] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [21] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [22] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [23] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [24] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [25] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [26] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [27] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [28] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [29] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [30] registered
Dec 02 22:53:02 localhost kernel: acpiphp: Slot [31] registered
Dec 02 22:53:02 localhost kernel: PCI host bridge to bus 0000:00
Dec 02 22:53:02 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec 02 22:53:02 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec 02 22:53:02 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 02 22:53:02 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 02 22:53:02 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec 02 22:53:02 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 02 22:53:02 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec 02 22:53:02 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec 02 22:53:02 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec 02 22:53:02 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec 02 22:53:02 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec 02 22:53:02 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec 02 22:53:02 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec 02 22:53:02 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec 02 22:53:02 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec 02 22:53:02 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec 02 22:53:02 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec 02 22:53:02 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec 02 22:53:02 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec 02 22:53:02 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 02 22:53:02 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec 02 22:53:02 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 02 22:53:02 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec 02 22:53:02 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec 02 22:53:02 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 02 22:53:02 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 02 22:53:02 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec 02 22:53:02 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec 02 22:53:02 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 02 22:53:02 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec 02 22:53:02 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec 02 22:53:02 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec 02 22:53:02 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec 02 22:53:02 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 02 22:53:02 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec 02 22:53:02 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec 02 22:53:02 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 02 22:53:02 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec 02 22:53:02 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec 02 22:53:02 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 02 22:53:02 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 02 22:53:02 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 02 22:53:02 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 02 22:53:02 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 02 22:53:02 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 02 22:53:02 localhost kernel: iommu: Default domain type: Translated
Dec 02 22:53:02 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 02 22:53:02 localhost kernel: SCSI subsystem initialized
Dec 02 22:53:02 localhost kernel: ACPI: bus type USB registered
Dec 02 22:53:02 localhost kernel: usbcore: registered new interface driver usbfs
Dec 02 22:53:02 localhost kernel: usbcore: registered new interface driver hub
Dec 02 22:53:02 localhost kernel: usbcore: registered new device driver usb
Dec 02 22:53:02 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 02 22:53:02 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec 02 22:53:02 localhost kernel: PTP clock support registered
Dec 02 22:53:02 localhost kernel: EDAC MC: Ver: 3.0.0
Dec 02 22:53:02 localhost kernel: NetLabel: Initializing
Dec 02 22:53:02 localhost kernel: NetLabel:  domain hash size = 128
Dec 02 22:53:02 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec 02 22:53:02 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Dec 02 22:53:02 localhost kernel: PCI: Using ACPI for IRQ routing
Dec 02 22:53:02 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 02 22:53:02 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 02 22:53:02 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Dec 02 22:53:02 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 02 22:53:02 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 02 22:53:02 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 02 22:53:02 localhost kernel: vgaarb: loaded
Dec 02 22:53:02 localhost kernel: clocksource: Switched to clocksource kvm-clock
Dec 02 22:53:02 localhost kernel: VFS: Disk quotas dquot_6.6.0
Dec 02 22:53:02 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 02 22:53:02 localhost kernel: pnp: PnP ACPI init
Dec 02 22:53:02 localhost kernel: pnp 00:03: [dma 2]
Dec 02 22:53:02 localhost kernel: pnp: PnP ACPI: found 5 devices
Dec 02 22:53:02 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 02 22:53:02 localhost kernel: NET: Registered PF_INET protocol family
Dec 02 22:53:02 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 02 22:53:02 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec 02 22:53:02 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 02 22:53:02 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec 02 22:53:02 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 02 22:53:02 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec 02 22:53:02 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec 02 22:53:02 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 02 22:53:02 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 02 22:53:02 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 02 22:53:02 localhost kernel: NET: Registered PF_XDP protocol family
Dec 02 22:53:02 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec 02 22:53:02 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec 02 22:53:02 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 02 22:53:02 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 02 22:53:02 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec 02 22:53:02 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 02 22:53:02 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 02 22:53:02 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 02 22:53:02 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 104136 usecs
Dec 02 22:53:02 localhost kernel: PCI: CLS 0 bytes, default 64
Dec 02 22:53:02 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 02 22:53:02 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec 02 22:53:02 localhost kernel: ACPI: bus type thunderbolt registered
Dec 02 22:53:02 localhost kernel: Trying to unpack rootfs image as initramfs...
Dec 02 22:53:02 localhost kernel: Initialise system trusted keyrings
Dec 02 22:53:02 localhost kernel: Key type blacklist registered
Dec 02 22:53:02 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec 02 22:53:02 localhost kernel: zbud: loaded
Dec 02 22:53:02 localhost kernel: integrity: Platform Keyring initialized
Dec 02 22:53:02 localhost kernel: integrity: Machine keyring initialized
Dec 02 22:53:02 localhost kernel: Freeing initrd memory: 87804K
Dec 02 22:53:02 localhost kernel: NET: Registered PF_ALG protocol family
Dec 02 22:53:02 localhost kernel: xor: automatically using best checksumming function   avx       
Dec 02 22:53:02 localhost kernel: Key type asymmetric registered
Dec 02 22:53:02 localhost kernel: Asymmetric key parser 'x509' registered
Dec 02 22:53:02 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 02 22:53:02 localhost kernel: io scheduler mq-deadline registered
Dec 02 22:53:02 localhost kernel: io scheduler kyber registered
Dec 02 22:53:02 localhost kernel: io scheduler bfq registered
Dec 02 22:53:02 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 02 22:53:02 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 02 22:53:02 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 02 22:53:02 localhost kernel: ACPI: button: Power Button [PWRF]
Dec 02 22:53:02 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 02 22:53:02 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 02 22:53:02 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 02 22:53:02 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 02 22:53:02 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 02 22:53:02 localhost kernel: Non-volatile memory driver v1.3
Dec 02 22:53:02 localhost kernel: rdac: device handler registered
Dec 02 22:53:02 localhost kernel: hp_sw: device handler registered
Dec 02 22:53:02 localhost kernel: emc: device handler registered
Dec 02 22:53:02 localhost kernel: alua: device handler registered
Dec 02 22:53:02 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 02 22:53:02 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 02 22:53:02 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 02 22:53:02 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 02 22:53:02 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 02 22:53:02 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 02 22:53:02 localhost kernel: usb usb1: Product: UHCI Host Controller
Dec 02 22:53:02 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-645.el9.x86_64 uhci_hcd
Dec 02 22:53:02 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 02 22:53:02 localhost kernel: hub 1-0:1.0: USB hub found
Dec 02 22:53:02 localhost kernel: hub 1-0:1.0: 2 ports detected
Dec 02 22:53:02 localhost kernel: usbcore: registered new interface driver usbserial_generic
Dec 02 22:53:02 localhost kernel: usbserial: USB Serial support registered for generic
Dec 02 22:53:02 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 02 22:53:02 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 02 22:53:02 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 02 22:53:02 localhost kernel: mousedev: PS/2 mouse device common for all mice
Dec 02 22:53:02 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 02 22:53:02 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 02 22:53:02 localhost kernel: rtc_cmos 00:04: registered as rtc0
Dec 02 22:53:02 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-12-02T22:53:01 UTC (1764715981)
Dec 02 22:53:02 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 02 22:53:02 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec 02 22:53:02 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 02 22:53:02 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 02 22:53:02 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 02 22:53:02 localhost kernel: usbcore: registered new interface driver usbhid
Dec 02 22:53:02 localhost kernel: usbhid: USB HID core driver
Dec 02 22:53:02 localhost kernel: drop_monitor: Initializing network drop monitor service
Dec 02 22:53:02 localhost kernel: Initializing XFRM netlink socket
Dec 02 22:53:02 localhost kernel: NET: Registered PF_INET6 protocol family
Dec 02 22:53:02 localhost kernel: Segment Routing with IPv6
Dec 02 22:53:02 localhost kernel: NET: Registered PF_PACKET protocol family
Dec 02 22:53:02 localhost kernel: mpls_gso: MPLS GSO support
Dec 02 22:53:02 localhost kernel: IPI shorthand broadcast: enabled
Dec 02 22:53:02 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Dec 02 22:53:02 localhost kernel: AES CTR mode by8 optimization enabled
Dec 02 22:53:02 localhost kernel: sched_clock: Marking stable (1248002267, 154116900)->(1537703318, -135584151)
Dec 02 22:53:02 localhost kernel: registered taskstats version 1
Dec 02 22:53:02 localhost kernel: Loading compiled-in X.509 certificates
Dec 02 22:53:02 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec 02 22:53:02 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 02 22:53:02 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 02 22:53:02 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec 02 22:53:02 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec 02 22:53:02 localhost kernel: Demotion targets for Node 0: null
Dec 02 22:53:02 localhost kernel: page_owner is disabled
Dec 02 22:53:02 localhost kernel: Key type .fscrypt registered
Dec 02 22:53:02 localhost kernel: Key type fscrypt-provisioning registered
Dec 02 22:53:02 localhost kernel: Key type big_key registered
Dec 02 22:53:02 localhost kernel: Key type encrypted registered
Dec 02 22:53:02 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 02 22:53:02 localhost kernel: Loading compiled-in module X.509 certificates
Dec 02 22:53:02 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec 02 22:53:02 localhost kernel: ima: Allocated hash algorithm: sha256
Dec 02 22:53:02 localhost kernel: ima: No architecture policies found
Dec 02 22:53:02 localhost kernel: evm: Initialising EVM extended attributes:
Dec 02 22:53:02 localhost kernel: evm: security.selinux
Dec 02 22:53:02 localhost kernel: evm: security.SMACK64 (disabled)
Dec 02 22:53:02 localhost kernel: evm: security.SMACK64EXEC (disabled)
Dec 02 22:53:02 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 02 22:53:02 localhost kernel: evm: security.SMACK64MMAP (disabled)
Dec 02 22:53:02 localhost kernel: evm: security.apparmor (disabled)
Dec 02 22:53:02 localhost kernel: evm: security.ima
Dec 02 22:53:02 localhost kernel: evm: security.capability
Dec 02 22:53:02 localhost kernel: evm: HMAC attrs: 0x1
Dec 02 22:53:02 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 02 22:53:02 localhost kernel: Running certificate verification RSA selftest
Dec 02 22:53:02 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 02 22:53:02 localhost kernel: Running certificate verification ECDSA selftest
Dec 02 22:53:02 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec 02 22:53:02 localhost kernel: clk: Disabling unused clocks
Dec 02 22:53:02 localhost kernel: Freeing unused decrypted memory: 2028K
Dec 02 22:53:02 localhost kernel: Freeing unused kernel image (initmem) memory: 4196K
Dec 02 22:53:02 localhost kernel: Write protecting the kernel read-only data: 30720k
Dec 02 22:53:02 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 428K
Dec 02 22:53:02 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 02 22:53:02 localhost kernel: Run /init as init process
Dec 02 22:53:02 localhost kernel:   with arguments:
Dec 02 22:53:02 localhost kernel:     /init
Dec 02 22:53:02 localhost kernel:   with environment:
Dec 02 22:53:02 localhost kernel:     HOME=/
Dec 02 22:53:02 localhost kernel:     TERM=linux
Dec 02 22:53:02 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64
Dec 02 22:53:02 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 02 22:53:02 localhost systemd[1]: Detected virtualization kvm.
Dec 02 22:53:02 localhost systemd[1]: Detected architecture x86-64.
Dec 02 22:53:02 localhost systemd[1]: Running in initrd.
Dec 02 22:53:02 localhost systemd[1]: No hostname configured, using default hostname.
Dec 02 22:53:02 localhost systemd[1]: Hostname set to <localhost>.
Dec 02 22:53:02 localhost systemd[1]: Initializing machine ID from VM UUID.
Dec 02 22:53:02 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 02 22:53:02 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 02 22:53:02 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Dec 02 22:53:02 localhost kernel: usb 1-1: Manufacturer: QEMU
Dec 02 22:53:02 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 02 22:53:02 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 02 22:53:02 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 02 22:53:02 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Dec 02 22:53:02 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 02 22:53:02 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 02 22:53:02 localhost systemd[1]: Reached target Initrd /usr File System.
Dec 02 22:53:02 localhost systemd[1]: Reached target Local File Systems.
Dec 02 22:53:02 localhost systemd[1]: Reached target Path Units.
Dec 02 22:53:02 localhost systemd[1]: Reached target Slice Units.
Dec 02 22:53:02 localhost systemd[1]: Reached target Swaps.
Dec 02 22:53:02 localhost systemd[1]: Reached target Timer Units.
Dec 02 22:53:02 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 02 22:53:02 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Dec 02 22:53:02 localhost systemd[1]: Listening on Journal Socket.
Dec 02 22:53:02 localhost systemd[1]: Listening on udev Control Socket.
Dec 02 22:53:02 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 02 22:53:02 localhost systemd[1]: Reached target Socket Units.
Dec 02 22:53:02 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 02 22:53:02 localhost systemd[1]: Starting Journal Service...
Dec 02 22:53:02 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 02 22:53:02 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 02 22:53:02 localhost systemd[1]: Starting Create System Users...
Dec 02 22:53:02 localhost systemd[1]: Starting Setup Virtual Console...
Dec 02 22:53:02 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 02 22:53:02 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 02 22:53:02 localhost systemd-journald[306]: Journal started
Dec 02 22:53:02 localhost systemd-journald[306]: Runtime Journal (/run/log/journal/8b5693ff2e2545a5bebe492dc3141f79) is 8.0M, max 153.6M, 145.6M free.
Dec 02 22:53:02 localhost systemd-sysusers[311]: Creating group 'users' with GID 100.
Dec 02 22:53:02 localhost systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Dec 02 22:53:02 localhost systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 02 22:53:02 localhost systemd[1]: Started Journal Service.
Dec 02 22:53:02 localhost systemd[1]: Finished Create System Users.
Dec 02 22:53:02 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 02 22:53:02 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 02 22:53:02 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 02 22:53:02 localhost systemd[1]: Finished Setup Virtual Console.
Dec 02 22:53:02 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 02 22:53:02 localhost systemd[1]: Starting dracut cmdline hook...
Dec 02 22:53:02 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 02 22:53:02 localhost dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Dec 02 22:53:02 localhost dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 02 22:53:02 localhost systemd[1]: Finished dracut cmdline hook.
Dec 02 22:53:02 localhost systemd[1]: Starting dracut pre-udev hook...
Dec 02 22:53:02 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 02 22:53:02 localhost kernel: device-mapper: uevent: version 1.0.3
Dec 02 22:53:02 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec 02 22:53:02 localhost kernel: RPC: Registered named UNIX socket transport module.
Dec 02 22:53:02 localhost kernel: RPC: Registered udp transport module.
Dec 02 22:53:02 localhost kernel: RPC: Registered tcp transport module.
Dec 02 22:53:02 localhost kernel: RPC: Registered tcp-with-tls transport module.
Dec 02 22:53:02 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 02 22:53:02 localhost rpc.statd[443]: Version 2.5.4 starting
Dec 02 22:53:02 localhost rpc.statd[443]: Initializing NSM state
Dec 02 22:53:02 localhost rpc.idmapd[448]: Setting log level to 0
Dec 02 22:53:02 localhost systemd[1]: Finished dracut pre-udev hook.
Dec 02 22:53:02 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 02 22:53:02 localhost systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Dec 02 22:53:02 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 02 22:53:02 localhost systemd[1]: Starting dracut pre-trigger hook...
Dec 02 22:53:02 localhost systemd[1]: Finished dracut pre-trigger hook.
Dec 02 22:53:02 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 02 22:53:02 localhost systemd[1]: Created slice Slice /system/modprobe.
Dec 02 22:53:02 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 02 22:53:02 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 02 22:53:02 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 02 22:53:02 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 02 22:53:03 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 02 22:53:03 localhost systemd[1]: Reached target Network.
Dec 02 22:53:03 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 02 22:53:03 localhost systemd[1]: Starting dracut initqueue hook...
Dec 02 22:53:03 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec 02 22:53:03 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec 02 22:53:03 localhost kernel:  vda: vda1
Dec 02 22:53:03 localhost systemd[1]: Mounting Kernel Configuration File System...
Dec 02 22:53:03 localhost systemd[1]: Mounted Kernel Configuration File System.
Dec 02 22:53:03 localhost systemd[1]: Reached target System Initialization.
Dec 02 22:53:03 localhost systemd[1]: Reached target Basic System.
Dec 02 22:53:03 localhost kernel: libata version 3.00 loaded.
Dec 02 22:53:03 localhost systemd-udevd[497]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 22:53:03 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Dec 02 22:53:03 localhost kernel: scsi host0: ata_piix
Dec 02 22:53:03 localhost kernel: scsi host1: ata_piix
Dec 02 22:53:03 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec 02 22:53:03 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec 02 22:53:03 localhost systemd[1]: Found device /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec 02 22:53:03 localhost systemd[1]: Reached target Initrd Root Device.
Dec 02 22:53:03 localhost kernel: ata1: found unknown device (class 0)
Dec 02 22:53:03 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 02 22:53:03 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec 02 22:53:03 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 02 22:53:03 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 02 22:53:03 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 02 22:53:03 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Dec 02 22:53:03 localhost systemd[1]: Finished dracut initqueue hook.
Dec 02 22:53:03 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 02 22:53:03 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Dec 02 22:53:03 localhost systemd[1]: Reached target Remote File Systems.
Dec 02 22:53:03 localhost systemd[1]: Starting dracut pre-mount hook...
Dec 02 22:53:03 localhost systemd[1]: Finished dracut pre-mount hook.
Dec 02 22:53:03 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f...
Dec 02 22:53:03 localhost systemd-fsck[552]: /usr/sbin/fsck.xfs: XFS file system.
Dec 02 22:53:03 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec 02 22:53:03 localhost systemd[1]: Mounting /sysroot...
Dec 02 22:53:04 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 02 22:53:04 localhost kernel: XFS (vda1): Mounting V5 Filesystem fcf6b761-831a-48a7-9f5f-068b5063763f
Dec 02 22:53:04 localhost kernel: XFS (vda1): Ending clean mount
Dec 02 22:53:04 localhost systemd[1]: Mounted /sysroot.
Dec 02 22:53:04 localhost systemd[1]: Reached target Initrd Root File System.
Dec 02 22:53:04 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 02 22:53:04 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 02 22:53:04 localhost systemd[1]: Reached target Initrd File Systems.
Dec 02 22:53:04 localhost systemd[1]: Reached target Initrd Default Target.
Dec 02 22:53:04 localhost systemd[1]: Starting dracut mount hook...
Dec 02 22:53:04 localhost systemd[1]: Finished dracut mount hook.
Dec 02 22:53:04 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 02 22:53:04 localhost rpc.idmapd[448]: exiting on signal 15
Dec 02 22:53:04 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 02 22:53:04 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 02 22:53:04 localhost systemd[1]: Stopped target Network.
Dec 02 22:53:04 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 02 22:53:04 localhost systemd[1]: Stopped target Timer Units.
Dec 02 22:53:04 localhost systemd[1]: dbus.socket: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 02 22:53:04 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 02 22:53:04 localhost systemd[1]: Stopped target Initrd Default Target.
Dec 02 22:53:04 localhost systemd[1]: Stopped target Basic System.
Dec 02 22:53:04 localhost systemd[1]: Stopped target Initrd Root Device.
Dec 02 22:53:04 localhost systemd[1]: Stopped target Initrd /usr File System.
Dec 02 22:53:04 localhost systemd[1]: Stopped target Path Units.
Dec 02 22:53:04 localhost systemd[1]: Stopped target Remote File Systems.
Dec 02 22:53:04 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 02 22:53:04 localhost systemd[1]: Stopped target Slice Units.
Dec 02 22:53:04 localhost systemd[1]: Stopped target Socket Units.
Dec 02 22:53:04 localhost systemd[1]: Stopped target System Initialization.
Dec 02 22:53:04 localhost systemd[1]: Stopped target Local File Systems.
Dec 02 22:53:04 localhost systemd[1]: Stopped target Swaps.
Dec 02 22:53:04 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Stopped dracut mount hook.
Dec 02 22:53:04 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Stopped dracut pre-mount hook.
Dec 02 22:53:04 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Dec 02 22:53:04 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 02 22:53:04 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Stopped dracut initqueue hook.
Dec 02 22:53:04 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 02 22:53:04 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Dec 02 22:53:04 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Stopped Coldplug All udev Devices.
Dec 02 22:53:04 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Stopped dracut pre-trigger hook.
Dec 02 22:53:04 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 02 22:53:04 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Stopped Setup Virtual Console.
Dec 02 22:53:04 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 02 22:53:04 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 02 22:53:04 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Closed udev Control Socket.
Dec 02 22:53:04 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Closed udev Kernel Socket.
Dec 02 22:53:04 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Stopped dracut pre-udev hook.
Dec 02 22:53:04 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Stopped dracut cmdline hook.
Dec 02 22:53:04 localhost systemd[1]: Starting Cleanup udev Database...
Dec 02 22:53:04 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 02 22:53:04 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Dec 02 22:53:04 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Stopped Create System Users.
Dec 02 22:53:04 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 02 22:53:04 localhost systemd[1]: Finished Cleanup udev Database.
Dec 02 22:53:04 localhost systemd[1]: Reached target Switch Root.
Dec 02 22:53:04 localhost systemd[1]: Starting Switch Root...
Dec 02 22:53:04 localhost systemd[1]: Switching root.
Dec 02 22:53:04 localhost systemd-journald[306]: Journal stopped
Dec 02 22:53:05 localhost systemd-journald[306]: Received SIGTERM from PID 1 (systemd).
Dec 02 22:53:05 localhost kernel: audit: type=1404 audit(1764715984.614:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 02 22:53:05 localhost kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 22:53:05 localhost kernel: SELinux:  policy capability open_perms=1
Dec 02 22:53:05 localhost kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 22:53:05 localhost kernel: SELinux:  policy capability always_check_network=0
Dec 02 22:53:05 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 22:53:05 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 22:53:05 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 22:53:05 localhost kernel: audit: type=1403 audit(1764715984.736:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 02 22:53:05 localhost systemd[1]: Successfully loaded SELinux policy in 124.477ms.
Dec 02 22:53:05 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 25.244ms.
Dec 02 22:53:05 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 02 22:53:05 localhost systemd[1]: Detected virtualization kvm.
Dec 02 22:53:05 localhost systemd[1]: Detected architecture x86-64.
Dec 02 22:53:05 localhost systemd-rc-local-generator[639]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 22:53:05 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 02 22:53:05 localhost systemd[1]: Stopped Switch Root.
Dec 02 22:53:05 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 02 22:53:05 localhost systemd[1]: Created slice Slice /system/getty.
Dec 02 22:53:05 localhost systemd[1]: Created slice Slice /system/serial-getty.
Dec 02 22:53:05 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Dec 02 22:53:05 localhost systemd[1]: Created slice User and Session Slice.
Dec 02 22:53:05 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 02 22:53:05 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Dec 02 22:53:05 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 02 22:53:05 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 02 22:53:05 localhost systemd[1]: Stopped target Switch Root.
Dec 02 22:53:05 localhost systemd[1]: Stopped target Initrd File Systems.
Dec 02 22:53:05 localhost systemd[1]: Stopped target Initrd Root File System.
Dec 02 22:53:05 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Dec 02 22:53:05 localhost systemd[1]: Reached target Path Units.
Dec 02 22:53:05 localhost systemd[1]: Reached target rpc_pipefs.target.
Dec 02 22:53:05 localhost systemd[1]: Reached target Slice Units.
Dec 02 22:53:05 localhost systemd[1]: Reached target Swaps.
Dec 02 22:53:05 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Dec 02 22:53:05 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Dec 02 22:53:05 localhost systemd[1]: Reached target RPC Port Mapper.
Dec 02 22:53:05 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 02 22:53:05 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Dec 02 22:53:05 localhost systemd[1]: Listening on udev Control Socket.
Dec 02 22:53:05 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 02 22:53:05 localhost systemd[1]: Mounting Huge Pages File System...
Dec 02 22:53:05 localhost systemd[1]: Mounting POSIX Message Queue File System...
Dec 02 22:53:05 localhost systemd[1]: Mounting Kernel Debug File System...
Dec 02 22:53:05 localhost systemd[1]: Mounting Kernel Trace File System...
Dec 02 22:53:05 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 02 22:53:05 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 02 22:53:05 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 02 22:53:05 localhost systemd[1]: Starting Load Kernel Module drm...
Dec 02 22:53:05 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Dec 02 22:53:05 localhost systemd[1]: Starting Load Kernel Module fuse...
Dec 02 22:53:05 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 02 22:53:05 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 02 22:53:05 localhost systemd[1]: Stopped File System Check on Root Device.
Dec 02 22:53:05 localhost systemd[1]: Stopped Journal Service.
Dec 02 22:53:05 localhost kernel: fuse: init (API version 7.37)
Dec 02 22:53:05 localhost systemd[1]: Starting Journal Service...
Dec 02 22:53:05 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 02 22:53:05 localhost systemd[1]: Starting Generate network units from Kernel command line...
Dec 02 22:53:05 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 02 22:53:05 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Dec 02 22:53:05 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 02 22:53:05 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 02 22:53:05 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 02 22:53:05 localhost systemd-journald[680]: Journal started
Dec 02 22:53:05 localhost systemd-journald[680]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec 02 22:53:05 localhost systemd[1]: Queued start job for default target Multi-User System.
Dec 02 22:53:05 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 02 22:53:05 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 02 22:53:05 localhost systemd[1]: Started Journal Service.
Dec 02 22:53:05 localhost systemd[1]: Mounted Huge Pages File System.
Dec 02 22:53:05 localhost systemd[1]: Mounted POSIX Message Queue File System.
Dec 02 22:53:05 localhost systemd[1]: Mounted Kernel Debug File System.
Dec 02 22:53:05 localhost systemd[1]: Mounted Kernel Trace File System.
Dec 02 22:53:05 localhost kernel: ACPI: bus type drm_connector registered
Dec 02 22:53:05 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 02 22:53:05 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 02 22:53:05 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 02 22:53:05 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 02 22:53:05 localhost systemd[1]: Finished Load Kernel Module drm.
Dec 02 22:53:05 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 02 22:53:05 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Dec 02 22:53:05 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 02 22:53:05 localhost systemd[1]: Finished Load Kernel Module fuse.
Dec 02 22:53:05 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 02 22:53:05 localhost systemd[1]: Finished Generate network units from Kernel command line.
Dec 02 22:53:05 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 02 22:53:05 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 02 22:53:05 localhost systemd[1]: Mounting FUSE Control File System...
Dec 02 22:53:05 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 02 22:53:05 localhost systemd[1]: Starting Rebuild Hardware Database...
Dec 02 22:53:05 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 02 22:53:05 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 02 22:53:05 localhost systemd[1]: Starting Load/Save OS Random Seed...
Dec 02 22:53:05 localhost systemd[1]: Starting Create System Users...
Dec 02 22:53:05 localhost systemd[1]: Mounted FUSE Control File System.
Dec 02 22:53:05 localhost systemd-journald[680]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec 02 22:53:05 localhost systemd-journald[680]: Received client request to flush runtime journal.
Dec 02 22:53:05 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 02 22:53:05 localhost systemd[1]: Finished Load/Save OS Random Seed.
Dec 02 22:53:05 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 02 22:53:05 localhost systemd[1]: Finished Create System Users.
Dec 02 22:53:05 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 02 22:53:05 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 02 22:53:05 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 02 22:53:05 localhost systemd[1]: Reached target Preparation for Local File Systems.
Dec 02 22:53:05 localhost systemd[1]: Reached target Local File Systems.
Dec 02 22:53:05 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 02 22:53:05 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 02 22:53:05 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 02 22:53:05 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec 02 22:53:05 localhost systemd[1]: Starting Automatic Boot Loader Update...
Dec 02 22:53:05 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 02 22:53:05 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 02 22:53:05 localhost bootctl[697]: Couldn't find EFI system partition, skipping.
Dec 02 22:53:05 localhost systemd[1]: Finished Automatic Boot Loader Update.
Dec 02 22:53:05 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 02 22:53:05 localhost systemd[1]: Starting Security Auditing Service...
Dec 02 22:53:05 localhost systemd[1]: Starting RPC Bind...
Dec 02 22:53:05 localhost systemd[1]: Starting Rebuild Journal Catalog...
Dec 02 22:53:05 localhost auditd[703]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec 02 22:53:05 localhost auditd[703]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec 02 22:53:05 localhost systemd[1]: Finished Rebuild Journal Catalog.
Dec 02 22:53:05 localhost systemd[1]: Started RPC Bind.
Dec 02 22:53:05 localhost augenrules[708]: /sbin/augenrules: No change
Dec 02 22:53:05 localhost augenrules[723]: No rules
Dec 02 22:53:05 localhost augenrules[723]: enabled 1
Dec 02 22:53:05 localhost augenrules[723]: failure 1
Dec 02 22:53:05 localhost augenrules[723]: pid 703
Dec 02 22:53:05 localhost augenrules[723]: rate_limit 0
Dec 02 22:53:05 localhost augenrules[723]: backlog_limit 8192
Dec 02 22:53:05 localhost augenrules[723]: lost 0
Dec 02 22:53:05 localhost augenrules[723]: backlog 1
Dec 02 22:53:05 localhost augenrules[723]: backlog_wait_time 60000
Dec 02 22:53:05 localhost augenrules[723]: backlog_wait_time_actual 0
Dec 02 22:53:05 localhost augenrules[723]: enabled 1
Dec 02 22:53:05 localhost augenrules[723]: failure 1
Dec 02 22:53:05 localhost augenrules[723]: pid 703
Dec 02 22:53:05 localhost augenrules[723]: rate_limit 0
Dec 02 22:53:05 localhost augenrules[723]: backlog_limit 8192
Dec 02 22:53:05 localhost augenrules[723]: lost 0
Dec 02 22:53:05 localhost augenrules[723]: backlog 0
Dec 02 22:53:05 localhost augenrules[723]: backlog_wait_time 60000
Dec 02 22:53:05 localhost augenrules[723]: backlog_wait_time_actual 0
Dec 02 22:53:05 localhost augenrules[723]: enabled 1
Dec 02 22:53:05 localhost augenrules[723]: failure 1
Dec 02 22:53:05 localhost augenrules[723]: pid 703
Dec 02 22:53:05 localhost augenrules[723]: rate_limit 0
Dec 02 22:53:05 localhost augenrules[723]: backlog_limit 8192
Dec 02 22:53:05 localhost augenrules[723]: lost 0
Dec 02 22:53:05 localhost augenrules[723]: backlog 3
Dec 02 22:53:05 localhost augenrules[723]: backlog_wait_time 60000
Dec 02 22:53:05 localhost augenrules[723]: backlog_wait_time_actual 0
Dec 02 22:53:05 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 02 22:53:05 localhost systemd[1]: Started Security Auditing Service.
Dec 02 22:53:05 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 02 22:53:05 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 02 22:53:05 localhost systemd[1]: Finished Rebuild Hardware Database.
Dec 02 22:53:05 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 02 22:53:05 localhost systemd[1]: Starting Update is Completed...
Dec 02 22:53:05 localhost systemd[1]: Finished Update is Completed.
Dec 02 22:53:06 localhost systemd-udevd[732]: Using default interface naming scheme 'rhel-9.0'.
Dec 02 22:53:06 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 02 22:53:06 localhost systemd[1]: Reached target System Initialization.
Dec 02 22:53:06 localhost systemd[1]: Started dnf makecache --timer.
Dec 02 22:53:06 localhost systemd[1]: Started Daily rotation of log files.
Dec 02 22:53:06 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 02 22:53:06 localhost systemd[1]: Reached target Timer Units.
Dec 02 22:53:06 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 02 22:53:06 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 02 22:53:06 localhost systemd[1]: Reached target Socket Units.
Dec 02 22:53:06 localhost systemd[1]: Starting D-Bus System Message Bus...
Dec 02 22:53:06 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 02 22:53:06 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 02 22:53:06 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 02 22:53:06 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 02 22:53:06 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 02 22:53:06 localhost systemd[1]: Started D-Bus System Message Bus.
Dec 02 22:53:06 localhost systemd[1]: Reached target Basic System.
Dec 02 22:53:06 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 02 22:53:06 localhost dbus-broker-lau[769]: Ready
Dec 02 22:53:06 localhost systemd-udevd[738]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 22:53:06 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 02 22:53:06 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec 02 22:53:06 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec 02 22:53:06 localhost systemd[1]: Starting NTP client/server...
Dec 02 22:53:06 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec 02 22:53:06 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 02 22:53:06 localhost systemd[1]: Starting IPv4 firewall with iptables...
Dec 02 22:53:06 localhost systemd[1]: Started irqbalance daemon.
Dec 02 22:53:06 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 02 22:53:06 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 22:53:06 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 22:53:06 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 22:53:06 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 02 22:53:06 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 02 22:53:06 localhost systemd[1]: Reached target User and Group Name Lookups.
Dec 02 22:53:06 localhost systemd[1]: Starting User Login Management...
Dec 02 22:53:06 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 02 22:53:06 localhost chronyd[802]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 02 22:53:06 localhost chronyd[802]: Loaded 0 symmetric keys
Dec 02 22:53:06 localhost chronyd[802]: Using right/UTC timezone to obtain leap second data
Dec 02 22:53:06 localhost chronyd[802]: Loaded seccomp filter (level 2)
Dec 02 22:53:06 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec 02 22:53:06 localhost systemd[1]: Started NTP client/server.
Dec 02 22:53:06 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec 02 22:53:06 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 02 22:53:06 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 02 22:53:06 localhost kernel: Console: switching to colour dummy device 80x25
Dec 02 22:53:06 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 02 22:53:06 localhost kernel: [drm] features: -context_init
Dec 02 22:53:06 localhost kernel: [drm] number of scanouts: 1
Dec 02 22:53:06 localhost kernel: [drm] number of cap sets: 0
Dec 02 22:53:06 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec 02 22:53:06 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec 02 22:53:06 localhost kernel: Console: switching to colour frame buffer device 128x48
Dec 02 22:53:06 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 02 22:53:06 localhost systemd-logind[790]: New seat seat0.
Dec 02 22:53:06 localhost systemd-logind[790]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 02 22:53:06 localhost systemd-logind[790]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 02 22:53:06 localhost kernel: kvm_amd: TSC scaling supported
Dec 02 22:53:06 localhost kernel: kvm_amd: Nested Virtualization enabled
Dec 02 22:53:06 localhost kernel: kvm_amd: Nested Paging enabled
Dec 02 22:53:06 localhost kernel: kvm_amd: LBR virtualization supported
Dec 02 22:53:06 localhost systemd[1]: Started User Login Management.
Dec 02 22:53:06 localhost iptables.init[783]: iptables: Applying firewall rules: [  OK  ]
Dec 02 22:53:06 localhost systemd[1]: Finished IPv4 firewall with iptables.
Dec 02 22:53:06 localhost cloud-init[840]: Cloud-init v. 24.4-7.el9 running 'init-local' at Tue, 02 Dec 2025 22:53:06 +0000. Up 6.33 seconds.
Dec 02 22:53:06 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Dec 02 22:53:06 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Dec 02 22:53:06 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp_t8_e5uy.mount: Deactivated successfully.
Dec 02 22:53:06 localhost systemd[1]: Starting Hostname Service...
Dec 02 22:53:07 localhost systemd[1]: Started Hostname Service.
Dec 02 22:53:07 np0005542928.novalocal systemd-hostnamed[855]: Hostname set to <np0005542928.novalocal> (static)
Dec 02 22:53:07 np0005542928.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec 02 22:53:07 np0005542928.novalocal systemd[1]: Reached target Preparation for Network.
Dec 02 22:53:07 np0005542928.novalocal systemd[1]: Starting Network Manager...
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2134] NetworkManager (version 1.54.1-1.el9) is starting... (boot:f1d327b1-3d2c-4b37-9ec1-7a4c7cbc8c21)
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2137] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2200] manager[0x559763ca7080]: monitoring kernel firmware directory '/lib/firmware'.
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2246] hostname: hostname: using hostnamed
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2246] hostname: static hostname changed from (none) to "np0005542928.novalocal"
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2249] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2361] manager[0x559763ca7080]: rfkill: Wi-Fi hardware radio set enabled
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2362] manager[0x559763ca7080]: rfkill: WWAN hardware radio set enabled
Dec 02 22:53:07 np0005542928.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2458] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2459] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2460] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2461] manager: Networking is enabled by state file
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2464] settings: Loaded settings plugin: keyfile (internal)
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2481] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2514] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2537] dhcp: init: Using DHCP client 'internal'
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2542] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2566] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2577] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2590] device (lo): Activation: starting connection 'lo' (625c3601-1ec7-443e-9214-f1cc220bd16a)
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2607] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2613] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2655] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2661] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2666] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2669] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2673] device (eth0): carrier: link connected
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2679] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 02 22:53:07 np0005542928.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2697] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2706] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2710] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2711] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2712] manager: NetworkManager state is now CONNECTING
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2714] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2720] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 22:53:07 np0005542928.novalocal systemd[1]: Started Network Manager.
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2722] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 02 22:53:07 np0005542928.novalocal systemd[1]: Reached target Network.
Dec 02 22:53:07 np0005542928.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 02 22:53:07 np0005542928.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2856] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 02 22:53:07 np0005542928.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2859] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 02 22:53:07 np0005542928.novalocal NetworkManager[859]: <info>  [1764715987.2869] device (lo): Activation: successful, device activated.
Dec 02 22:53:07 np0005542928.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Dec 02 22:53:07 np0005542928.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 02 22:53:07 np0005542928.novalocal systemd[1]: Reached target NFS client services.
Dec 02 22:53:07 np0005542928.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Dec 02 22:53:07 np0005542928.novalocal systemd[1]: Reached target Remote File Systems.
Dec 02 22:53:07 np0005542928.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 02 22:53:08 np0005542928.novalocal NetworkManager[859]: <info>  [1764715988.9212] dhcp4 (eth0): state changed new lease, address=38.102.83.74
Dec 02 22:53:08 np0005542928.novalocal NetworkManager[859]: <info>  [1764715988.9237] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 02 22:53:08 np0005542928.novalocal NetworkManager[859]: <info>  [1764715988.9280] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 22:53:08 np0005542928.novalocal NetworkManager[859]: <info>  [1764715988.9329] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 22:53:08 np0005542928.novalocal NetworkManager[859]: <info>  [1764715988.9332] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 22:53:08 np0005542928.novalocal NetworkManager[859]: <info>  [1764715988.9343] manager: NetworkManager state is now CONNECTED_SITE
Dec 02 22:53:08 np0005542928.novalocal NetworkManager[859]: <info>  [1764715988.9358] device (eth0): Activation: successful, device activated.
Dec 02 22:53:08 np0005542928.novalocal NetworkManager[859]: <info>  [1764715988.9366] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 02 22:53:08 np0005542928.novalocal NetworkManager[859]: <info>  [1764715988.9378] manager: startup complete
Dec 02 22:53:08 np0005542928.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 02 22:53:08 np0005542928.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: Cloud-init v. 24.4-7.el9 running 'init' at Tue, 02 Dec 2025 22:53:09 +0000. Up 8.98 seconds.
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: |  eth0  | True |         38.102.83.74         | 255.255.255.0 | global | fa:16:3e:e1:9e:c5 |
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: |  eth0  | True | fe80::f816:3eff:fee1:9ec5/64 |       .       |  link  | fa:16:3e:e1:9e:c5 |
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Dec 02 22:53:09 np0005542928.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 02 22:53:10 np0005542928.novalocal useradd[990]: new group: name=cloud-user, GID=1001
Dec 02 22:53:10 np0005542928.novalocal useradd[990]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Dec 02 22:53:10 np0005542928.novalocal useradd[990]: add 'cloud-user' to group 'adm'
Dec 02 22:53:10 np0005542928.novalocal useradd[990]: add 'cloud-user' to group 'systemd-journal'
Dec 02 22:53:10 np0005542928.novalocal useradd[990]: add 'cloud-user' to shadow group 'adm'
Dec 02 22:53:10 np0005542928.novalocal useradd[990]: add 'cloud-user' to shadow group 'systemd-journal'
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: Generating public/private rsa key pair.
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: The key fingerprint is:
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: SHA256:QUDOEVvy5scxPDiyoY64fYos5mviu+XB8gnQEFv5Blk root@np0005542928.novalocal
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: The key's randomart image is:
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: +---[RSA 3072]----+
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |. .+E.*oo        |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: | ++  o B o       |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |o  o  * * =      |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: | o  o. * + +     |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |. ... . S o      |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |...o     .       |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |.o.+.            |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |+** +            |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |XB**             |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: +----[SHA256]-----+
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: Generating public/private ecdsa key pair.
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: The key fingerprint is:
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: SHA256:BDA9Mz1jJtY7TalLWCwU+GAdJ6PgsiF6gxazjA5i4V8 root@np0005542928.novalocal
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: The key's randomart image is:
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: +---[ECDSA 256]---+
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |  . o*B*.  .     |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: | . .+oX=X o      |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |++...+ @.B       |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |*==   o.= .      |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |B*o  E .So       |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |*....   .        |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: | . .             |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |                 |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |                 |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: +----[SHA256]-----+
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: Generating public/private ed25519 key pair.
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: The key fingerprint is:
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: SHA256:Qr89N4l1d2QrqDsAZei1qPOrvSl4xNj/14S+h1B9q40 root@np0005542928.novalocal
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: The key's randomart image is:
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: +--[ED25519 256]--+
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |      .          |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |     . +         |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |    . * . .     o|
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |     = o . ... o.|
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |  + . o S ..o.o.o|
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: | . *   + +.+ +...|
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |  o +   =.B B    |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: | . o.o.  =.E o   |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: |  ..o=+..o+      |
Dec 02 22:53:10 np0005542928.novalocal cloud-init[923]: +----[SHA256]-----+
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Reached target Cloud-config availability.
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Reached target Network is Online.
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Starting Crash recovery kernel arming...
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Starting System Logging Service...
Dec 02 22:53:10 np0005542928.novalocal sm-notify[1006]: Version 2.5.4 starting
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Starting OpenSSH server daemon...
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Starting Permit User Sessions...
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Started Notify NFS peers of a restart.
Dec 02 22:53:10 np0005542928.novalocal sshd[1008]: Server listening on 0.0.0.0 port 22.
Dec 02 22:53:10 np0005542928.novalocal sshd[1008]: Server listening on :: port 22.
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Started OpenSSH server daemon.
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Finished Permit User Sessions.
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Started Command Scheduler.
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Started Getty on tty1.
Dec 02 22:53:10 np0005542928.novalocal crond[1012]: (CRON) STARTUP (1.5.7)
Dec 02 22:53:10 np0005542928.novalocal crond[1012]: (CRON) INFO (Syslog will be used instead of sendmail.)
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Started Serial Getty on ttyS0.
Dec 02 22:53:10 np0005542928.novalocal crond[1012]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 93% if used.)
Dec 02 22:53:10 np0005542928.novalocal crond[1012]: (CRON) INFO (running with inotify support)
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Reached target Login Prompts.
Dec 02 22:53:10 np0005542928.novalocal rsyslogd[1007]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1007" x-info="https://www.rsyslog.com"] start
Dec 02 22:53:10 np0005542928.novalocal rsyslogd[1007]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Started System Logging Service.
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Reached target Multi-User System.
Dec 02 22:53:10 np0005542928.novalocal sshd-session[1025]: Unable to negotiate with 38.102.83.114 port 36924: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 02 22:53:10 np0005542928.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 02 22:53:10 np0005542928.novalocal sshd-session[1034]: Connection reset by 38.102.83.114 port 36930 [preauth]
Dec 02 22:53:10 np0005542928.novalocal sshd-session[1043]: Unable to negotiate with 38.102.83.114 port 36940: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Dec 02 22:53:10 np0005542928.novalocal sshd-session[1053]: Unable to negotiate with 38.102.83.114 port 36952: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Dec 02 22:53:10 np0005542928.novalocal rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 22:53:10 np0005542928.novalocal sshd-session[1011]: Connection closed by 38.102.83.114 port 36920 [preauth]
Dec 02 22:53:10 np0005542928.novalocal kdumpctl[1018]: kdump: No kdump initial ramdisk found.
Dec 02 22:53:10 np0005542928.novalocal kdumpctl[1018]: kdump: Rebuilding /boot/initramfs-5.14.0-645.el9.x86_64kdump.img
Dec 02 22:53:10 np0005542928.novalocal sshd-session[1084]: Connection reset by 38.102.83.114 port 36970 [preauth]
Dec 02 22:53:10 np0005542928.novalocal sshd-session[1096]: Unable to negotiate with 38.102.83.114 port 36974: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Dec 02 22:53:10 np0005542928.novalocal sshd-session[1109]: Unable to negotiate with 38.102.83.114 port 36984: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Dec 02 22:53:11 np0005542928.novalocal sshd-session[1064]: Connection closed by 38.102.83.114 port 36958 [preauth]
Dec 02 22:53:11 np0005542928.novalocal cloud-init[1175]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Tue, 02 Dec 2025 22:53:11 +0000. Up 10.77 seconds.
Dec 02 22:53:11 np0005542928.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Dec 02 22:53:11 np0005542928.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Dec 02 22:53:11 np0005542928.novalocal dracut[1285]: dracut-057-102.git20250818.el9
Dec 02 22:53:11 np0005542928.novalocal cloud-init[1307]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Tue, 02 Dec 2025 22:53:11 +0000. Up 11.21 seconds.
Dec 02 22:53:11 np0005542928.novalocal dracut[1287]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-645.el9.x86_64kdump.img 5.14.0-645.el9.x86_64
Dec 02 22:53:11 np0005542928.novalocal cloud-init[1332]: #############################################################
Dec 02 22:53:11 np0005542928.novalocal cloud-init[1334]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 02 22:53:11 np0005542928.novalocal cloud-init[1342]: 256 SHA256:BDA9Mz1jJtY7TalLWCwU+GAdJ6PgsiF6gxazjA5i4V8 root@np0005542928.novalocal (ECDSA)
Dec 02 22:53:11 np0005542928.novalocal cloud-init[1353]: 256 SHA256:Qr89N4l1d2QrqDsAZei1qPOrvSl4xNj/14S+h1B9q40 root@np0005542928.novalocal (ED25519)
Dec 02 22:53:11 np0005542928.novalocal cloud-init[1360]: 3072 SHA256:QUDOEVvy5scxPDiyoY64fYos5mviu+XB8gnQEFv5Blk root@np0005542928.novalocal (RSA)
Dec 02 22:53:11 np0005542928.novalocal cloud-init[1361]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 02 22:53:11 np0005542928.novalocal cloud-init[1365]: #############################################################
Dec 02 22:53:11 np0005542928.novalocal cloud-init[1307]: Cloud-init v. 24.4-7.el9 finished at Tue, 02 Dec 2025 22:53:11 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.38 seconds
Dec 02 22:53:11 np0005542928.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Dec 02 22:53:11 np0005542928.novalocal systemd[1]: Reached target Cloud-init target.
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: Module 'resume' will not be installed, because it's in the list to be omitted!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: memstrack is not available
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 02 22:53:12 np0005542928.novalocal dracut[1287]: memstrack is not available
Dec 02 22:53:13 np0005542928.novalocal dracut[1287]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 02 22:53:13 np0005542928.novalocal dracut[1287]: *** Including module: systemd ***
Dec 02 22:53:13 np0005542928.novalocal dracut[1287]: *** Including module: fips ***
Dec 02 22:53:13 np0005542928.novalocal dracut[1287]: *** Including module: systemd-initrd ***
Dec 02 22:53:13 np0005542928.novalocal dracut[1287]: *** Including module: i18n ***
Dec 02 22:53:13 np0005542928.novalocal chronyd[802]: Selected source 167.160.187.12 (2.centos.pool.ntp.org)
Dec 02 22:53:13 np0005542928.novalocal chronyd[802]: System clock TAI offset set to 37 seconds
Dec 02 22:53:13 np0005542928.novalocal dracut[1287]: *** Including module: drm ***
Dec 02 22:53:14 np0005542928.novalocal dracut[1287]: *** Including module: prefixdevname ***
Dec 02 22:53:14 np0005542928.novalocal dracut[1287]: *** Including module: kernel-modules ***
Dec 02 22:53:14 np0005542928.novalocal kernel: block vda: the capability attribute has been deprecated.
Dec 02 22:53:14 np0005542928.novalocal dracut[1287]: *** Including module: kernel-modules-extra ***
Dec 02 22:53:14 np0005542928.novalocal dracut[1287]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Dec 02 22:53:14 np0005542928.novalocal dracut[1287]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Dec 02 22:53:14 np0005542928.novalocal dracut[1287]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Dec 02 22:53:14 np0005542928.novalocal dracut[1287]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Dec 02 22:53:14 np0005542928.novalocal dracut[1287]: *** Including module: qemu ***
Dec 02 22:53:14 np0005542928.novalocal dracut[1287]: *** Including module: fstab-sys ***
Dec 02 22:53:14 np0005542928.novalocal dracut[1287]: *** Including module: rootfs-block ***
Dec 02 22:53:15 np0005542928.novalocal dracut[1287]: *** Including module: terminfo ***
Dec 02 22:53:15 np0005542928.novalocal dracut[1287]: *** Including module: udev-rules ***
Dec 02 22:53:15 np0005542928.novalocal dracut[1287]: Skipping udev rule: 91-permissions.rules
Dec 02 22:53:15 np0005542928.novalocal dracut[1287]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 02 22:53:15 np0005542928.novalocal dracut[1287]: *** Including module: virtiofs ***
Dec 02 22:53:15 np0005542928.novalocal dracut[1287]: *** Including module: dracut-systemd ***
Dec 02 22:53:16 np0005542928.novalocal dracut[1287]: *** Including module: usrmount ***
Dec 02 22:53:16 np0005542928.novalocal dracut[1287]: *** Including module: base ***
Dec 02 22:53:16 np0005542928.novalocal dracut[1287]: *** Including module: fs-lib ***
Dec 02 22:53:16 np0005542928.novalocal dracut[1287]: *** Including module: kdumpbase ***
Dec 02 22:53:16 np0005542928.novalocal irqbalance[787]: Cannot change IRQ 25 affinity: Operation not permitted
Dec 02 22:53:16 np0005542928.novalocal irqbalance[787]: IRQ 25 affinity is now unmanaged
Dec 02 22:53:16 np0005542928.novalocal irqbalance[787]: Cannot change IRQ 31 affinity: Operation not permitted
Dec 02 22:53:16 np0005542928.novalocal irqbalance[787]: IRQ 31 affinity is now unmanaged
Dec 02 22:53:16 np0005542928.novalocal irqbalance[787]: Cannot change IRQ 28 affinity: Operation not permitted
Dec 02 22:53:16 np0005542928.novalocal irqbalance[787]: IRQ 28 affinity is now unmanaged
Dec 02 22:53:16 np0005542928.novalocal irqbalance[787]: Cannot change IRQ 32 affinity: Operation not permitted
Dec 02 22:53:16 np0005542928.novalocal irqbalance[787]: IRQ 32 affinity is now unmanaged
Dec 02 22:53:16 np0005542928.novalocal irqbalance[787]: Cannot change IRQ 30 affinity: Operation not permitted
Dec 02 22:53:16 np0005542928.novalocal irqbalance[787]: IRQ 30 affinity is now unmanaged
Dec 02 22:53:16 np0005542928.novalocal irqbalance[787]: Cannot change IRQ 29 affinity: Operation not permitted
Dec 02 22:53:16 np0005542928.novalocal irqbalance[787]: IRQ 29 affinity is now unmanaged
Dec 02 22:53:16 np0005542928.novalocal dracut[1287]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 02 22:53:16 np0005542928.novalocal dracut[1287]:   microcode_ctl module: mangling fw_dir
Dec 02 22:53:16 np0005542928.novalocal dracut[1287]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec 02 22:53:16 np0005542928.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 02 22:53:16 np0005542928.novalocal dracut[1287]:     microcode_ctl: configuration "intel" is ignored
Dec 02 22:53:16 np0005542928.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 02 22:53:16 np0005542928.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 02 22:53:16 np0005542928.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 02 22:53:16 np0005542928.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 02 22:53:16 np0005542928.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 02 22:53:16 np0005542928.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 02 22:53:16 np0005542928.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]: *** Including module: openssl ***
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]: *** Including module: shutdown ***
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]: *** Including module: squash ***
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]: *** Including modules done ***
Dec 02 22:53:17 np0005542928.novalocal dracut[1287]: *** Installing kernel module dependencies ***
Dec 02 22:53:18 np0005542928.novalocal dracut[1287]: *** Installing kernel module dependencies done ***
Dec 02 22:53:18 np0005542928.novalocal dracut[1287]: *** Resolving executable dependencies ***
Dec 02 22:53:19 np0005542928.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 22:53:20 np0005542928.novalocal dracut[1287]: *** Resolving executable dependencies done ***
Dec 02 22:53:20 np0005542928.novalocal dracut[1287]: *** Generating early-microcode cpio image ***
Dec 02 22:53:20 np0005542928.novalocal dracut[1287]: *** Store current command line parameters ***
Dec 02 22:53:20 np0005542928.novalocal dracut[1287]: Stored kernel commandline:
Dec 02 22:53:20 np0005542928.novalocal dracut[1287]: No dracut internal kernel commandline stored in the initramfs
Dec 02 22:53:20 np0005542928.novalocal dracut[1287]: *** Install squash loader ***
Dec 02 22:53:21 np0005542928.novalocal dracut[1287]: *** Squashing the files inside the initramfs ***
Dec 02 22:53:22 np0005542928.novalocal dracut[1287]: *** Squashing the files inside the initramfs done ***
Dec 02 22:53:22 np0005542928.novalocal dracut[1287]: *** Creating image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' ***
Dec 02 22:53:22 np0005542928.novalocal dracut[1287]: *** Hardlinking files ***
Dec 02 22:53:22 np0005542928.novalocal dracut[1287]: Mode:           real
Dec 02 22:53:22 np0005542928.novalocal dracut[1287]: Files:          50
Dec 02 22:53:22 np0005542928.novalocal dracut[1287]: Linked:         0 files
Dec 02 22:53:22 np0005542928.novalocal dracut[1287]: Compared:       0 xattrs
Dec 02 22:53:22 np0005542928.novalocal dracut[1287]: Compared:       0 files
Dec 02 22:53:22 np0005542928.novalocal dracut[1287]: Saved:          0 B
Dec 02 22:53:22 np0005542928.novalocal dracut[1287]: Duration:       0.000962 seconds
Dec 02 22:53:22 np0005542928.novalocal dracut[1287]: *** Hardlinking files done ***
Dec 02 22:53:22 np0005542928.novalocal dracut[1287]: *** Creating initramfs image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' done ***
Dec 02 22:53:23 np0005542928.novalocal kdumpctl[1018]: kdump: kexec: loaded kdump kernel
Dec 02 22:53:23 np0005542928.novalocal kdumpctl[1018]: kdump: Starting kdump: [OK]
Dec 02 22:53:23 np0005542928.novalocal systemd[1]: Finished Crash recovery kernel arming.
Dec 02 22:53:23 np0005542928.novalocal systemd[1]: Startup finished in 1.543s (kernel) + 2.782s (initrd) + 18.707s (userspace) = 23.033s.
Dec 02 22:53:26 np0005542928.novalocal sshd-session[3789]: Invalid user support from 80.94.95.116 port 18638
Dec 02 22:53:26 np0005542928.novalocal sshd-session[3789]: Connection closed by invalid user support 80.94.95.116 port 18638 [preauth]
Dec 02 22:53:31 np0005542928.novalocal sshd-session[4299]: Accepted publickey for zuul from 38.102.83.114 port 35692 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Dec 02 22:53:31 np0005542928.novalocal systemd[1]: Created slice User Slice of UID 1000.
Dec 02 22:53:31 np0005542928.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 02 22:53:31 np0005542928.novalocal systemd-logind[790]: New session 1 of user zuul.
Dec 02 22:53:31 np0005542928.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 02 22:53:31 np0005542928.novalocal systemd[1]: Starting User Manager for UID 1000...
Dec 02 22:53:31 np0005542928.novalocal systemd[4303]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 22:53:31 np0005542928.novalocal systemd[4303]: Queued start job for default target Main User Target.
Dec 02 22:53:31 np0005542928.novalocal systemd[4303]: Created slice User Application Slice.
Dec 02 22:53:31 np0005542928.novalocal systemd[4303]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 02 22:53:31 np0005542928.novalocal systemd[4303]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 22:53:31 np0005542928.novalocal systemd[4303]: Reached target Paths.
Dec 02 22:53:31 np0005542928.novalocal systemd[4303]: Reached target Timers.
Dec 02 22:53:31 np0005542928.novalocal systemd[4303]: Starting D-Bus User Message Bus Socket...
Dec 02 22:53:31 np0005542928.novalocal systemd[4303]: Starting Create User's Volatile Files and Directories...
Dec 02 22:53:31 np0005542928.novalocal systemd[4303]: Finished Create User's Volatile Files and Directories.
Dec 02 22:53:31 np0005542928.novalocal systemd[4303]: Listening on D-Bus User Message Bus Socket.
Dec 02 22:53:31 np0005542928.novalocal systemd[4303]: Reached target Sockets.
Dec 02 22:53:31 np0005542928.novalocal systemd[4303]: Reached target Basic System.
Dec 02 22:53:31 np0005542928.novalocal systemd[4303]: Reached target Main User Target.
Dec 02 22:53:31 np0005542928.novalocal systemd[4303]: Startup finished in 179ms.
Dec 02 22:53:31 np0005542928.novalocal systemd[1]: Started User Manager for UID 1000.
Dec 02 22:53:31 np0005542928.novalocal systemd[1]: Started Session 1 of User zuul.
Dec 02 22:53:31 np0005542928.novalocal sshd-session[4299]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 22:53:32 np0005542928.novalocal python3[4386]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 22:53:36 np0005542928.novalocal python3[4414]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 22:53:37 np0005542928.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 02 22:53:37 np0005542928.novalocal sshd-session[4449]: Received disconnect from 80.94.93.119 port 59092:11:  [preauth]
Dec 02 22:53:37 np0005542928.novalocal sshd-session[4449]: Disconnected from authenticating user root 80.94.93.119 port 59092 [preauth]
Dec 02 22:53:41 np0005542928.novalocal python3[4476]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 22:53:43 np0005542928.novalocal python3[4516]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 02 22:53:45 np0005542928.novalocal python3[4542]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD4U+kYD0NRncUxX+JcUB7DRqILQVJDR8uhD3ks0Ul/EeWgVXEffIQ6qoDCZfMe9U2/TbSPjeQxRgrxfeojshOb90kAetShddcGy8qV3/MPy+wQVVV1rYoL3quzM5Aq/+yxxiNhtzetxzg9+fYQ2RCmPT7lduoZwxU6u936ZxDFI68NvtLWahzQ+M1heDP7uxDZQ9tlgqT8eJifbx4ZTmiC+l6jhgkBDnXRH6h2kIx0R+gfYMaSYKEzVhzG6w1nXZR6xFPAwbbZcUIMhMqO/V4dbStGfEcoodG69OYWlpuu93qNro9BrqffizKa2kknlLLFvgueRkCId9XrXN6p7IuzIjTV0h3DIY3uLZmygElHHoI8eKSOWIxwqSq866WuVWi4PlJY4XAhlllCpoOs8C9ugUhElgy1QCzbZerHV5m5m+Z42nwFUfeUQFIloFxeTlJIWCIXceAA1LqMifrZXliCnI20C7EFs4NwaJq4wBnBa7i/RKHSIWz3twW0fOCBh5c= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:53:46 np0005542928.novalocal python3[4566]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:46 np0005542928.novalocal python3[4665]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:53:47 np0005542928.novalocal python3[4736]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764716026.298835-230-159300447719155/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=248d0ab726be451cb68f679daa67a6cc_id_rsa follow=False checksum=ecd6e0f03c6e65bc1b100de91867e2771eda57bc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:47 np0005542928.novalocal python3[4859]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:53:48 np0005542928.novalocal python3[4930]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764716027.305092-274-244322235898861/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=248d0ab726be451cb68f679daa67a6cc_id_rsa.pub follow=False checksum=b88ecc87674b1a86abb7c76bf7f4e81098e02b1a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:49 np0005542928.novalocal python3[4978]: ansible-ping Invoked with data=pong
Dec 02 22:53:50 np0005542928.novalocal python3[5002]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 22:53:52 np0005542928.novalocal python3[5060]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 02 22:53:53 np0005542928.novalocal python3[5092]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:53 np0005542928.novalocal python3[5116]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:53 np0005542928.novalocal python3[5140]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:54 np0005542928.novalocal python3[5164]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:54 np0005542928.novalocal python3[5188]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:54 np0005542928.novalocal python3[5212]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:56 np0005542928.novalocal sudo[5236]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwnbeqoekuxyexqcjuohedfiiqisxgdx ; /usr/bin/python3'
Dec 02 22:53:56 np0005542928.novalocal sudo[5236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:53:56 np0005542928.novalocal python3[5238]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:56 np0005542928.novalocal sudo[5236]: pam_unix(sudo:session): session closed for user root
Dec 02 22:53:56 np0005542928.novalocal sudo[5314]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keqgpvnwquqjoszehyofqhpddgsrgzjd ; /usr/bin/python3'
Dec 02 22:53:56 np0005542928.novalocal sudo[5314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:53:56 np0005542928.novalocal python3[5316]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:53:56 np0005542928.novalocal sudo[5314]: pam_unix(sudo:session): session closed for user root
Dec 02 22:53:57 np0005542928.novalocal sudo[5387]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duyyloaivgasaslanpqgupmaviushxub ; /usr/bin/python3'
Dec 02 22:53:57 np0005542928.novalocal sudo[5387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:53:57 np0005542928.novalocal python3[5389]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764716036.3578057-28-20049875829518/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:57 np0005542928.novalocal sudo[5387]: pam_unix(sudo:session): session closed for user root
Dec 02 22:53:58 np0005542928.novalocal python3[5437]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:53:58 np0005542928.novalocal python3[5461]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:53:58 np0005542928.novalocal python3[5485]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:53:58 np0005542928.novalocal python3[5509]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:53:59 np0005542928.novalocal python3[5533]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:53:59 np0005542928.novalocal python3[5557]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:53:59 np0005542928.novalocal python3[5581]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:00 np0005542928.novalocal python3[5605]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:00 np0005542928.novalocal python3[5629]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:00 np0005542928.novalocal python3[5653]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:00 np0005542928.novalocal python3[5677]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:01 np0005542928.novalocal python3[5701]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:01 np0005542928.novalocal python3[5725]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:01 np0005542928.novalocal python3[5749]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:02 np0005542928.novalocal python3[5773]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:02 np0005542928.novalocal python3[5797]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:02 np0005542928.novalocal python3[5821]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:03 np0005542928.novalocal python3[5845]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:03 np0005542928.novalocal python3[5869]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:03 np0005542928.novalocal python3[5893]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:03 np0005542928.novalocal python3[5917]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:04 np0005542928.novalocal python3[5941]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:04 np0005542928.novalocal python3[5965]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:04 np0005542928.novalocal python3[5989]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:05 np0005542928.novalocal python3[6013]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:05 np0005542928.novalocal python3[6037]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:07 np0005542928.novalocal sudo[6061]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvodwljwgdyfyppoptbztywhgrpbinsm ; /usr/bin/python3'
Dec 02 22:54:07 np0005542928.novalocal sudo[6061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:54:08 np0005542928.novalocal python3[6063]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 02 22:54:08 np0005542928.novalocal systemd[1]: Starting Time & Date Service...
Dec 02 22:54:08 np0005542928.novalocal systemd[1]: Started Time & Date Service.
Dec 02 22:54:08 np0005542928.novalocal systemd-timedated[6065]: Changed time zone to 'UTC' (UTC).
Dec 02 22:54:08 np0005542928.novalocal sudo[6061]: pam_unix(sudo:session): session closed for user root
Dec 02 22:54:08 np0005542928.novalocal sudo[6092]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jihnogoofsetyblyqcjpfooijummgvmk ; /usr/bin/python3'
Dec 02 22:54:08 np0005542928.novalocal sudo[6092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:54:08 np0005542928.novalocal python3[6094]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:54:08 np0005542928.novalocal sudo[6092]: pam_unix(sudo:session): session closed for user root
Dec 02 22:54:09 np0005542928.novalocal python3[6170]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:54:09 np0005542928.novalocal python3[6241]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764716048.8263373-203-19411280422685/source _original_basename=tmprb48jycd follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:54:10 np0005542928.novalocal python3[6341]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:54:10 np0005542928.novalocal python3[6412]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764716049.7617145-244-542865870213/source _original_basename=tmpz93892r3 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:54:11 np0005542928.novalocal sudo[6512]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swzrejouyjhguzewzjocgunixijiauty ; /usr/bin/python3'
Dec 02 22:54:11 np0005542928.novalocal sudo[6512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:54:11 np0005542928.novalocal python3[6514]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:54:11 np0005542928.novalocal sudo[6512]: pam_unix(sudo:session): session closed for user root
Dec 02 22:54:11 np0005542928.novalocal sudo[6585]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thdjhkfpuqklpvgsecszgetawraivjrx ; /usr/bin/python3'
Dec 02 22:54:11 np0005542928.novalocal sudo[6585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:54:11 np0005542928.novalocal python3[6587]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764716050.9322393-307-42504975301870/source _original_basename=tmp349ezna_ follow=False checksum=6c462e10cf6b935fb22f4386c31d576dcf4d4133 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:54:11 np0005542928.novalocal sudo[6585]: pam_unix(sudo:session): session closed for user root
Dec 02 22:54:12 np0005542928.novalocal python3[6635]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 22:54:12 np0005542928.novalocal python3[6661]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 22:54:12 np0005542928.novalocal sudo[6739]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-borxvixdoladbidhxxdofmsbvcmokump ; /usr/bin/python3'
Dec 02 22:54:12 np0005542928.novalocal sudo[6739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:54:13 np0005542928.novalocal python3[6741]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:54:13 np0005542928.novalocal sudo[6739]: pam_unix(sudo:session): session closed for user root
Dec 02 22:54:13 np0005542928.novalocal sudo[6812]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmkucmwpdcawshgigvoyrotyjatxaafo ; /usr/bin/python3'
Dec 02 22:54:13 np0005542928.novalocal sudo[6812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:54:13 np0005542928.novalocal python3[6814]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764716052.751081-363-22893309946378/source _original_basename=tmp_cfue4q7 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:54:13 np0005542928.novalocal sudo[6812]: pam_unix(sudo:session): session closed for user root
Dec 02 22:54:13 np0005542928.novalocal sudo[6863]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxzwimxwmgifjcatggxdieaztlxzokhm ; /usr/bin/python3'
Dec 02 22:54:13 np0005542928.novalocal sudo[6863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:54:14 np0005542928.novalocal python3[6865]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-8d85-4712-00000000001e-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 22:54:14 np0005542928.novalocal sudo[6863]: pam_unix(sudo:session): session closed for user root
Dec 02 22:54:14 np0005542928.novalocal python3[6893]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-8d85-4712-00000000001f-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 02 22:54:16 np0005542928.novalocal python3[6921]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:54:33 np0005542928.novalocal sudo[6945]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sffwjwohnvevhnzezljoxxnbqcggtnsa ; /usr/bin/python3'
Dec 02 22:54:33 np0005542928.novalocal sudo[6945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:54:33 np0005542928.novalocal python3[6947]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:54:33 np0005542928.novalocal sudo[6945]: pam_unix(sudo:session): session closed for user root
Dec 02 22:54:38 np0005542928.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 02 22:55:30 np0005542928.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 02 22:55:30 np0005542928.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec 02 22:55:30 np0005542928.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec 02 22:55:30 np0005542928.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec 02 22:55:30 np0005542928.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec 02 22:55:30 np0005542928.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec 02 22:55:30 np0005542928.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec 02 22:55:30 np0005542928.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec 02 22:55:30 np0005542928.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec 02 22:55:30 np0005542928.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 02 22:55:30 np0005542928.novalocal NetworkManager[859]: <info>  [1764716130.4916] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 02 22:55:30 np0005542928.novalocal systemd-udevd[6951]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 22:55:30 np0005542928.novalocal NetworkManager[859]: <info>  [1764716130.5180] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 22:55:30 np0005542928.novalocal NetworkManager[859]: <info>  [1764716130.5216] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 02 22:55:30 np0005542928.novalocal NetworkManager[859]: <info>  [1764716130.5220] device (eth1): carrier: link connected
Dec 02 22:55:30 np0005542928.novalocal NetworkManager[859]: <info>  [1764716130.5223] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 02 22:55:30 np0005542928.novalocal NetworkManager[859]: <info>  [1764716130.5230] policy: auto-activating connection 'Wired connection 1' (733da19b-fb13-3701-b718-a40535d4912d)
Dec 02 22:55:30 np0005542928.novalocal NetworkManager[859]: <info>  [1764716130.5235] device (eth1): Activation: starting connection 'Wired connection 1' (733da19b-fb13-3701-b718-a40535d4912d)
Dec 02 22:55:30 np0005542928.novalocal NetworkManager[859]: <info>  [1764716130.5236] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 22:55:30 np0005542928.novalocal NetworkManager[859]: <info>  [1764716130.5239] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 22:55:30 np0005542928.novalocal NetworkManager[859]: <info>  [1764716130.5245] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 22:55:30 np0005542928.novalocal NetworkManager[859]: <info>  [1764716130.5250] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 02 22:55:31 np0005542928.novalocal python3[6977]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-ab69-8977-000000000173-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 22:55:38 np0005542928.novalocal sudo[7055]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzueeabdthsdadwpeboozxcndfbjzgbe ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 02 22:55:38 np0005542928.novalocal sudo[7055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:55:38 np0005542928.novalocal python3[7057]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:55:38 np0005542928.novalocal sudo[7055]: pam_unix(sudo:session): session closed for user root
Dec 02 22:55:38 np0005542928.novalocal sudo[7128]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krruegqtrvoxzobpmeohnroaiqnlmovv ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 02 22:55:38 np0005542928.novalocal sudo[7128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:55:39 np0005542928.novalocal python3[7130]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764716138.304591-151-89616029700775/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=3a60d54ecce910a84c781400a55d3593b65a59c0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:55:39 np0005542928.novalocal sudo[7128]: pam_unix(sudo:session): session closed for user root
Dec 02 22:55:39 np0005542928.novalocal sudo[7178]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wklxfioaxykjhwcjfrvurfsqwdlgjxyc ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 02 22:55:39 np0005542928.novalocal sudo[7178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:55:39 np0005542928.novalocal python3[7180]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 22:55:39 np0005542928.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 02 22:55:39 np0005542928.novalocal systemd[1]: Stopped Network Manager Wait Online.
Dec 02 22:55:39 np0005542928.novalocal systemd[1]: Stopping Network Manager Wait Online...
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[859]: <info>  [1764716139.7598] caught SIGTERM, shutting down normally.
Dec 02 22:55:39 np0005542928.novalocal systemd[1]: Stopping Network Manager...
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[859]: <info>  [1764716139.7611] dhcp4 (eth0): canceled DHCP transaction
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[859]: <info>  [1764716139.7611] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[859]: <info>  [1764716139.7612] dhcp4 (eth0): state changed no lease
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[859]: <info>  [1764716139.7616] manager: NetworkManager state is now CONNECTING
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[859]: <info>  [1764716139.7735] dhcp4 (eth1): canceled DHCP transaction
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[859]: <info>  [1764716139.7735] dhcp4 (eth1): state changed no lease
Dec 02 22:55:39 np0005542928.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[859]: <info>  [1764716139.7822] exiting (success)
Dec 02 22:55:39 np0005542928.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 02 22:55:39 np0005542928.novalocal systemd[1]: Stopped Network Manager.
Dec 02 22:55:39 np0005542928.novalocal systemd[1]: NetworkManager.service: Consumed 1.119s CPU time, 9.9M memory peak.
Dec 02 22:55:39 np0005542928.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 22:55:39 np0005542928.novalocal systemd[1]: Starting Network Manager...
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.8677] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:f1d327b1-3d2c-4b37-9ec1-7a4c7cbc8c21)
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.8682] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.8760] manager[0x563bfd13e070]: monitoring kernel firmware directory '/lib/firmware'.
Dec 02 22:55:39 np0005542928.novalocal systemd[1]: Starting Hostname Service...
Dec 02 22:55:39 np0005542928.novalocal systemd[1]: Started Hostname Service.
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9865] hostname: hostname: using hostnamed
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9865] hostname: static hostname changed from (none) to "np0005542928.novalocal"
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9869] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9875] manager[0x563bfd13e070]: rfkill: Wi-Fi hardware radio set enabled
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9875] manager[0x563bfd13e070]: rfkill: WWAN hardware radio set enabled
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9903] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9904] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9904] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9905] manager: Networking is enabled by state file
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9907] settings: Loaded settings plugin: keyfile (internal)
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9911] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9933] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9942] dhcp: init: Using DHCP client 'internal'
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9945] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9950] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9955] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9962] device (lo): Activation: starting connection 'lo' (625c3601-1ec7-443e-9214-f1cc220bd16a)
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9969] device (eth0): carrier: link connected
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9973] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9977] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9978] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9983] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9989] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9995] device (eth1): carrier: link connected
Dec 02 22:55:39 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716139.9999] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0003] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (733da19b-fb13-3701-b718-a40535d4912d) (indicated)
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0003] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0008] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0014] device (eth1): Activation: starting connection 'Wired connection 1' (733da19b-fb13-3701-b718-a40535d4912d)
Dec 02 22:55:40 np0005542928.novalocal systemd[1]: Started Network Manager.
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0022] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0025] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0027] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0029] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0030] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0033] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0035] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0037] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0039] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0046] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0049] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0058] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0060] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0087] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0089] dhcp4 (eth0): state changed new lease, address=38.102.83.74
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0091] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0106] device (lo): Activation: successful, device activated.
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0126] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 02 22:55:40 np0005542928.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0217] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0282] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0285] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0290] manager: NetworkManager state is now CONNECTED_SITE
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0296] device (eth0): Activation: successful, device activated.
Dec 02 22:55:40 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716140.0303] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 02 22:55:40 np0005542928.novalocal sudo[7178]: pam_unix(sudo:session): session closed for user root
Dec 02 22:55:40 np0005542928.novalocal python3[7265]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-ab69-8977-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 22:55:50 np0005542928.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 22:56:10 np0005542928.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 02 22:56:18 np0005542928.novalocal systemd[4303]: Starting Mark boot as successful...
Dec 02 22:56:18 np0005542928.novalocal systemd[4303]: Finished Mark boot as successful.
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.2879] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 02 22:56:25 np0005542928.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 22:56:25 np0005542928.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3254] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3258] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3273] device (eth1): Activation: successful, device activated.
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3283] manager: startup complete
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3286] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <warn>  [1764716185.3297] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3312] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec 02 22:56:25 np0005542928.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3442] dhcp4 (eth1): canceled DHCP transaction
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3444] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3444] dhcp4 (eth1): state changed no lease
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3471] policy: auto-activating connection 'ci-private-network' (5b544c7d-595f-5c85-b896-4057860a4650)
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3479] device (eth1): Activation: starting connection 'ci-private-network' (5b544c7d-595f-5c85-b896-4057860a4650)
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3480] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3486] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3496] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3511] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3577] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3580] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 22:56:25 np0005542928.novalocal NetworkManager[7188]: <info>  [1764716185.3592] device (eth1): Activation: successful, device activated.
Dec 02 22:56:35 np0005542928.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 22:56:40 np0005542928.novalocal sshd-session[4313]: Received disconnect from 38.102.83.114 port 35692:11: disconnected by user
Dec 02 22:56:40 np0005542928.novalocal sshd-session[4313]: Disconnected from user zuul 38.102.83.114 port 35692
Dec 02 22:56:40 np0005542928.novalocal sshd-session[4299]: pam_unix(sshd:session): session closed for user zuul
Dec 02 22:56:40 np0005542928.novalocal systemd-logind[790]: Session 1 logged out. Waiting for processes to exit.
Dec 02 22:56:43 np0005542928.novalocal sshd-session[7294]: Accepted publickey for zuul from 38.102.83.114 port 45558 ssh2: RSA SHA256:hdlXDg7PlzRXiLISnY+IUpp6Y3Jc5y9DXpVHJTD4Z4A
Dec 02 22:56:43 np0005542928.novalocal systemd-logind[790]: New session 3 of user zuul.
Dec 02 22:56:43 np0005542928.novalocal systemd[1]: Started Session 3 of User zuul.
Dec 02 22:56:43 np0005542928.novalocal sshd-session[7294]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 22:56:43 np0005542928.novalocal sudo[7373]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrzzwrbwvixtzprivcfkyxtngcyizrek ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 02 22:56:43 np0005542928.novalocal sudo[7373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:56:43 np0005542928.novalocal python3[7375]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:56:43 np0005542928.novalocal sudo[7373]: pam_unix(sudo:session): session closed for user root
Dec 02 22:56:43 np0005542928.novalocal sudo[7446]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjcbxcuuvvgiewikdgsufmeconufwyok ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 02 22:56:43 np0005542928.novalocal sudo[7446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:56:44 np0005542928.novalocal python3[7448]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764716203.2799304-309-155199615620166/source _original_basename=tmpseg8x9jr follow=False checksum=e4fb491a69cfdde53e5aa8dc33e934e0cc50f41b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:56:44 np0005542928.novalocal sudo[7446]: pam_unix(sudo:session): session closed for user root
Dec 02 22:56:46 np0005542928.novalocal sshd-session[7297]: Connection closed by 38.102.83.114 port 45558
Dec 02 22:56:46 np0005542928.novalocal sshd-session[7294]: pam_unix(sshd:session): session closed for user zuul
Dec 02 22:56:46 np0005542928.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Dec 02 22:56:46 np0005542928.novalocal systemd-logind[790]: Session 3 logged out. Waiting for processes to exit.
Dec 02 22:56:46 np0005542928.novalocal systemd-logind[790]: Removed session 3.
Dec 02 22:59:16 np0005542928.novalocal sshd-session[7474]: Connection reset by 134.209.86.24 port 21637 [preauth]
Dec 02 22:59:18 np0005542928.novalocal systemd[4303]: Created slice User Background Tasks Slice.
Dec 02 22:59:18 np0005542928.novalocal systemd[4303]: Starting Cleanup of User's Temporary Files and Directories...
Dec 02 22:59:18 np0005542928.novalocal systemd[4303]: Finished Cleanup of User's Temporary Files and Directories.
Dec 02 23:00:00 np0005542928.novalocal sshd-session[7478]: Received disconnect from 193.46.255.33 port 10900:11:  [preauth]
Dec 02 23:00:00 np0005542928.novalocal sshd-session[7478]: Disconnected from authenticating user root 193.46.255.33 port 10900 [preauth]
Dec 02 23:01:01 np0005542928.novalocal CROND[7482]: (root) CMD (run-parts /etc/cron.hourly)
Dec 02 23:01:01 np0005542928.novalocal run-parts[7485]: (/etc/cron.hourly) starting 0anacron
Dec 02 23:01:01 np0005542928.novalocal anacron[7493]: Anacron started on 2025-12-02
Dec 02 23:01:01 np0005542928.novalocal anacron[7493]: Will run job `cron.daily' in 22 min.
Dec 02 23:01:01 np0005542928.novalocal anacron[7493]: Will run job `cron.weekly' in 42 min.
Dec 02 23:01:01 np0005542928.novalocal anacron[7493]: Will run job `cron.monthly' in 62 min.
Dec 02 23:01:01 np0005542928.novalocal anacron[7493]: Jobs will be executed sequentially
Dec 02 23:01:01 np0005542928.novalocal run-parts[7495]: (/etc/cron.hourly) finished 0anacron
Dec 02 23:01:01 np0005542928.novalocal CROND[7481]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 02 23:03:05 np0005542928.novalocal sshd-session[7497]: Accepted publickey for zuul from 38.102.83.114 port 57140 ssh2: RSA SHA256:hdlXDg7PlzRXiLISnY+IUpp6Y3Jc5y9DXpVHJTD4Z4A
Dec 02 23:03:05 np0005542928.novalocal systemd-logind[790]: New session 4 of user zuul.
Dec 02 23:03:05 np0005542928.novalocal systemd[1]: Started Session 4 of User zuul.
Dec 02 23:03:05 np0005542928.novalocal sshd-session[7497]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:03:06 np0005542928.novalocal sudo[7524]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wghjcvbkztniimdrjliigszqtanhtxga ; /usr/bin/python3'
Dec 02 23:03:06 np0005542928.novalocal sudo[7524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:06 np0005542928.novalocal python3[7526]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-3fa7-c4d8-000000001cd7-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:03:06 np0005542928.novalocal sudo[7524]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:06 np0005542928.novalocal sudo[7552]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwqauasijfzlyygyjoduyzpsqmipfygm ; /usr/bin/python3'
Dec 02 23:03:06 np0005542928.novalocal sudo[7552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:06 np0005542928.novalocal python3[7554]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:03:06 np0005542928.novalocal sudo[7552]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:06 np0005542928.novalocal sudo[7579]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvnsrlldukdoiycchvitedtdkgzywsdx ; /usr/bin/python3'
Dec 02 23:03:06 np0005542928.novalocal sudo[7579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:06 np0005542928.novalocal python3[7581]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:03:06 np0005542928.novalocal sudo[7579]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:06 np0005542928.novalocal sudo[7605]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyulgouobngycfwjhgmuzrslrjbwbdpw ; /usr/bin/python3'
Dec 02 23:03:06 np0005542928.novalocal sudo[7605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:07 np0005542928.novalocal python3[7607]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:03:07 np0005542928.novalocal sudo[7605]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:07 np0005542928.novalocal sudo[7631]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohhqroowehzgunixenmqwskmvbycfufl ; /usr/bin/python3'
Dec 02 23:03:07 np0005542928.novalocal sudo[7631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:07 np0005542928.novalocal python3[7633]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:03:07 np0005542928.novalocal sudo[7631]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:07 np0005542928.novalocal sudo[7657]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oypkkfhjpxgrrxkyejcwhuhrjecdvunu ; /usr/bin/python3'
Dec 02 23:03:07 np0005542928.novalocal sudo[7657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:08 np0005542928.novalocal python3[7659]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:03:08 np0005542928.novalocal sudo[7657]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:08 np0005542928.novalocal sudo[7735]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uybijbpakphomqdwwbeckyhkixkhuqls ; /usr/bin/python3'
Dec 02 23:03:08 np0005542928.novalocal sudo[7735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:08 np0005542928.novalocal python3[7737]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:03:08 np0005542928.novalocal sudo[7735]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:08 np0005542928.novalocal sudo[7808]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfbpjeqghohrqcowlgfjlicwspiyawpr ; /usr/bin/python3'
Dec 02 23:03:08 np0005542928.novalocal sudo[7808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:08 np0005542928.novalocal python3[7810]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764716588.2636256-496-278095854116031/source _original_basename=tmpxzepq3fu follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:03:08 np0005542928.novalocal sudo[7808]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:09 np0005542928.novalocal sudo[7858]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdpgvivxzngsakswmddxkguakchpnbjl ; /usr/bin/python3'
Dec 02 23:03:09 np0005542928.novalocal sudo[7858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:10 np0005542928.novalocal python3[7860]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:03:10 np0005542928.novalocal systemd[1]: Reloading.
Dec 02 23:03:10 np0005542928.novalocal systemd-rc-local-generator[7883]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:03:10 np0005542928.novalocal sudo[7858]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:11 np0005542928.novalocal sudo[7914]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvqpgfxhmxqhfmqptsjpcehvnfndveur ; /usr/bin/python3'
Dec 02 23:03:11 np0005542928.novalocal sudo[7914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:11 np0005542928.novalocal python3[7916]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 02 23:03:11 np0005542928.novalocal sudo[7914]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:11 np0005542928.novalocal sudo[7940]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugcjmjsgrikvlzkhlssvnubfentehfun ; /usr/bin/python3'
Dec 02 23:03:11 np0005542928.novalocal sudo[7940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:12 np0005542928.novalocal python3[7942]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:03:12 np0005542928.novalocal sudo[7940]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:12 np0005542928.novalocal sudo[7968]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhgfsbhefgzgzkkpjchvndnqlnqrqmdo ; /usr/bin/python3'
Dec 02 23:03:12 np0005542928.novalocal sudo[7968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:12 np0005542928.novalocal python3[7970]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:03:12 np0005542928.novalocal sudo[7968]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:12 np0005542928.novalocal sudo[7996]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcqkzwdcpgywrvyvylhbrpwgbcdehjab ; /usr/bin/python3'
Dec 02 23:03:12 np0005542928.novalocal sudo[7996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:12 np0005542928.novalocal python3[7998]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:03:12 np0005542928.novalocal sudo[7996]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:12 np0005542928.novalocal sudo[8024]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmudkqylzddguaomsidgpdwogyixzfwt ; /usr/bin/python3'
Dec 02 23:03:12 np0005542928.novalocal sudo[8024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:12 np0005542928.novalocal python3[8026]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:03:12 np0005542928.novalocal sudo[8024]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:13 np0005542928.novalocal python3[8053]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-3fa7-c4d8-000000001cde-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:03:14 np0005542928.novalocal python3[8083]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 23:03:16 np0005542928.novalocal sshd-session[7500]: Connection closed by 38.102.83.114 port 57140
Dec 02 23:03:16 np0005542928.novalocal sshd-session[7497]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:03:16 np0005542928.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Dec 02 23:03:16 np0005542928.novalocal systemd[1]: session-4.scope: Consumed 4.359s CPU time.
Dec 02 23:03:16 np0005542928.novalocal systemd-logind[790]: Session 4 logged out. Waiting for processes to exit.
Dec 02 23:03:16 np0005542928.novalocal systemd-logind[790]: Removed session 4.
Dec 02 23:03:18 np0005542928.novalocal sshd-session[8087]: Accepted publickey for zuul from 38.102.83.114 port 60304 ssh2: RSA SHA256:hdlXDg7PlzRXiLISnY+IUpp6Y3Jc5y9DXpVHJTD4Z4A
Dec 02 23:03:18 np0005542928.novalocal systemd-logind[790]: New session 5 of user zuul.
Dec 02 23:03:18 np0005542928.novalocal systemd[1]: Started Session 5 of User zuul.
Dec 02 23:03:18 np0005542928.novalocal sshd-session[8087]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:03:18 np0005542928.novalocal sudo[8114]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnlaidbnsbdvakjgimzxuxysejgcovox ; /usr/bin/python3'
Dec 02 23:03:18 np0005542928.novalocal sudo[8114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:18 np0005542928.novalocal python3[8116]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 23:03:32 np0005542928.novalocal kernel: SELinux:  Converting 386 SID table entries...
Dec 02 23:03:32 np0005542928.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:03:32 np0005542928.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 02 23:03:32 np0005542928.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:03:32 np0005542928.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:03:32 np0005542928.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:03:32 np0005542928.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:03:32 np0005542928.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:03:41 np0005542928.novalocal kernel: SELinux:  Converting 386 SID table entries...
Dec 02 23:03:41 np0005542928.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:03:41 np0005542928.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 02 23:03:41 np0005542928.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:03:41 np0005542928.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:03:41 np0005542928.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:03:41 np0005542928.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:03:41 np0005542928.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:03:52 np0005542928.novalocal kernel: SELinux:  Converting 386 SID table entries...
Dec 02 23:03:52 np0005542928.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:03:52 np0005542928.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 02 23:03:52 np0005542928.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:03:52 np0005542928.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:03:52 np0005542928.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:03:52 np0005542928.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:03:52 np0005542928.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:03:53 np0005542928.novalocal setsebool[8178]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 02 23:03:53 np0005542928.novalocal setsebool[8178]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec 02 23:04:06 np0005542928.novalocal kernel: SELinux:  Converting 389 SID table entries...
Dec 02 23:04:06 np0005542928.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:04:06 np0005542928.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 02 23:04:06 np0005542928.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:04:06 np0005542928.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:04:06 np0005542928.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:04:06 np0005542928.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:04:06 np0005542928.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:04:23 np0005542928.novalocal dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 02 23:04:23 np0005542928.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 23:04:24 np0005542928.novalocal systemd[1]: Starting man-db-cache-update.service...
Dec 02 23:04:24 np0005542928.novalocal systemd[1]: Reloading.
Dec 02 23:04:24 np0005542928.novalocal systemd-rc-local-generator[8932]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:04:24 np0005542928.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 23:04:25 np0005542928.novalocal sudo[8114]: pam_unix(sudo:session): session closed for user root
Dec 02 23:04:26 np0005542928.novalocal python3[10059]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ec2-ffbe-6827-230b-00000000000b-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:04:26 np0005542928.novalocal kernel: evm: overlay not supported
Dec 02 23:04:26 np0005542928.novalocal systemd[4303]: Starting D-Bus User Message Bus...
Dec 02 23:04:26 np0005542928.novalocal dbus-broker-launch[10897]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 02 23:04:26 np0005542928.novalocal dbus-broker-launch[10897]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 02 23:04:26 np0005542928.novalocal systemd[4303]: Started D-Bus User Message Bus.
Dec 02 23:04:27 np0005542928.novalocal dbus-broker-lau[10897]: Ready
Dec 02 23:04:27 np0005542928.novalocal systemd[4303]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 02 23:04:27 np0005542928.novalocal systemd[4303]: Created slice Slice /user.
Dec 02 23:04:27 np0005542928.novalocal systemd[4303]: podman-10762.scope: unit configures an IP firewall, but not running as root.
Dec 02 23:04:27 np0005542928.novalocal systemd[4303]: (This warning is only shown for the first unit using IP firewalling.)
Dec 02 23:04:27 np0005542928.novalocal systemd[4303]: Started podman-10762.scope.
Dec 02 23:04:27 np0005542928.novalocal systemd[4303]: Started podman-pause-a6a0bc3e.scope.
Dec 02 23:04:27 np0005542928.novalocal sudo[11643]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxtajuiftmhiayrtzoysiyhdlvntxibh ; /usr/bin/python3'
Dec 02 23:04:27 np0005542928.novalocal sudo[11643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:04:27 np0005542928.novalocal python3[11669]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.2:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.2:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:04:27 np0005542928.novalocal python3[11669]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Dec 02 23:04:27 np0005542928.novalocal sudo[11643]: pam_unix(sudo:session): session closed for user root
Dec 02 23:04:28 np0005542928.novalocal sshd-session[8090]: Connection closed by 38.102.83.114 port 60304
Dec 02 23:04:28 np0005542928.novalocal sshd-session[8087]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:04:28 np0005542928.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Dec 02 23:04:28 np0005542928.novalocal systemd[1]: session-5.scope: Consumed 1min 4.234s CPU time.
Dec 02 23:04:28 np0005542928.novalocal systemd-logind[790]: Session 5 logged out. Waiting for processes to exit.
Dec 02 23:04:28 np0005542928.novalocal systemd-logind[790]: Removed session 5.
Dec 02 23:04:35 np0005542928.novalocal sshd-session[14603]: Invalid user sshadmin from 185.156.73.233 port 30526
Dec 02 23:04:36 np0005542928.novalocal sshd-session[14603]: Connection closed by invalid user sshadmin 185.156.73.233 port 30526 [preauth]
Dec 02 23:04:36 np0005542928.novalocal irqbalance[787]: Cannot change IRQ 27 affinity: Operation not permitted
Dec 02 23:04:36 np0005542928.novalocal irqbalance[787]: IRQ 27 affinity is now unmanaged
Dec 02 23:04:47 np0005542928.novalocal sshd-session[19760]: Unable to negotiate with 38.102.83.66 port 51846: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 02 23:04:47 np0005542928.novalocal sshd-session[19765]: Unable to negotiate with 38.102.83.66 port 51826: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 02 23:04:47 np0005542928.novalocal sshd-session[19757]: Connection closed by 38.102.83.66 port 51808 [preauth]
Dec 02 23:04:47 np0005542928.novalocal sshd-session[19761]: Connection closed by 38.102.83.66 port 51812 [preauth]
Dec 02 23:04:47 np0005542928.novalocal sshd-session[19767]: Unable to negotiate with 38.102.83.66 port 51832: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 02 23:04:52 np0005542928.novalocal sshd-session[21372]: Accepted publickey for zuul from 38.102.83.114 port 42124 ssh2: RSA SHA256:hdlXDg7PlzRXiLISnY+IUpp6Y3Jc5y9DXpVHJTD4Z4A
Dec 02 23:04:52 np0005542928.novalocal systemd-logind[790]: New session 6 of user zuul.
Dec 02 23:04:52 np0005542928.novalocal systemd[1]: Started Session 6 of User zuul.
Dec 02 23:04:52 np0005542928.novalocal sshd-session[21372]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:04:52 np0005542928.novalocal python3[21498]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN59DNT3Ni5luGimbJB902j8ywAXk/V0moDqx3ShASHiCOzoT242Be+x+X2vIUoDwfIddRBT8pqsU1aeIxWrMFc= zuul@np0005542926.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 23:04:52 np0005542928.novalocal sudo[21681]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbndeyifcpbpglydcsaeefhfgonujinn ; /usr/bin/python3'
Dec 02 23:04:52 np0005542928.novalocal sudo[21681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:04:53 np0005542928.novalocal python3[21699]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN59DNT3Ni5luGimbJB902j8ywAXk/V0moDqx3ShASHiCOzoT242Be+x+X2vIUoDwfIddRBT8pqsU1aeIxWrMFc= zuul@np0005542926.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 23:04:53 np0005542928.novalocal sudo[21681]: pam_unix(sudo:session): session closed for user root
Dec 02 23:04:53 np0005542928.novalocal sudo[22062]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhuvxjmrnhkfxztjezdejplpcpjbkkjx ; /usr/bin/python3'
Dec 02 23:04:53 np0005542928.novalocal sudo[22062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:04:53 np0005542928.novalocal python3[22072]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005542928.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 02 23:04:53 np0005542928.novalocal useradd[22150]: new group: name=cloud-admin, GID=1002
Dec 02 23:04:53 np0005542928.novalocal useradd[22150]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Dec 02 23:04:54 np0005542928.novalocal sudo[22062]: pam_unix(sudo:session): session closed for user root
Dec 02 23:04:54 np0005542928.novalocal sudo[22290]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aatgbhmkfabxfpevdfwsuycyxttxqjjw ; /usr/bin/python3'
Dec 02 23:04:54 np0005542928.novalocal sudo[22290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:04:54 np0005542928.novalocal python3[22298]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN59DNT3Ni5luGimbJB902j8ywAXk/V0moDqx3ShASHiCOzoT242Be+x+X2vIUoDwfIddRBT8pqsU1aeIxWrMFc= zuul@np0005542926.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 23:04:54 np0005542928.novalocal sudo[22290]: pam_unix(sudo:session): session closed for user root
Dec 02 23:04:54 np0005542928.novalocal sudo[22554]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncqdjxisqhubzdpdtswwmhpmfzrmpufq ; /usr/bin/python3'
Dec 02 23:04:54 np0005542928.novalocal sudo[22554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:04:54 np0005542928.novalocal python3[22560]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:04:54 np0005542928.novalocal sudo[22554]: pam_unix(sudo:session): session closed for user root
Dec 02 23:04:55 np0005542928.novalocal sudo[22828]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mindzxehxtjjifzuymtmqfxgenlzftat ; /usr/bin/python3'
Dec 02 23:04:55 np0005542928.novalocal sudo[22828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:04:55 np0005542928.novalocal python3[22838]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764716694.5947828-152-126997646942329/source _original_basename=tmpznk23_5w follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:04:55 np0005542928.novalocal sudo[22828]: pam_unix(sudo:session): session closed for user root
Dec 02 23:04:56 np0005542928.novalocal sudo[23125]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kizifzdlnuwbsiqqisahbsegkbdpcrxa ; /usr/bin/python3'
Dec 02 23:04:56 np0005542928.novalocal sudo[23125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:04:56 np0005542928.novalocal irqbalance[787]: Cannot change IRQ 26 affinity: Operation not permitted
Dec 02 23:04:56 np0005542928.novalocal irqbalance[787]: IRQ 26 affinity is now unmanaged
Dec 02 23:04:56 np0005542928.novalocal python3[23136]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Dec 02 23:04:56 np0005542928.novalocal systemd[1]: Starting Hostname Service...
Dec 02 23:04:56 np0005542928.novalocal systemd[1]: Started Hostname Service.
Dec 02 23:04:56 np0005542928.novalocal systemd-hostnamed[23234]: Changed pretty hostname to 'compute-1'
Dec 02 23:04:56 compute-1 systemd-hostnamed[23234]: Hostname set to <compute-1> (static)
Dec 02 23:04:56 compute-1 NetworkManager[7188]: <info>  [1764716696.5043] hostname: static hostname changed from "np0005542928.novalocal" to "compute-1"
Dec 02 23:04:56 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 23:04:56 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 23:04:56 compute-1 sudo[23125]: pam_unix(sudo:session): session closed for user root
Dec 02 23:04:57 compute-1 sshd-session[21433]: Connection closed by 38.102.83.114 port 42124
Dec 02 23:04:57 compute-1 sshd-session[21372]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:04:57 compute-1 systemd-logind[790]: Session 6 logged out. Waiting for processes to exit.
Dec 02 23:04:57 compute-1 systemd[1]: session-6.scope: Deactivated successfully.
Dec 02 23:04:57 compute-1 systemd[1]: session-6.scope: Consumed 2.452s CPU time.
Dec 02 23:04:57 compute-1 systemd-logind[790]: Removed session 6.
Dec 02 23:05:06 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 23:05:20 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 23:05:20 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 02 23:05:20 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1min 7.996s CPU time.
Dec 02 23:05:20 compute-1 systemd[1]: run-rb342eb6f2f9748b0b48e3cc0c28e6eb1.service: Deactivated successfully.
Dec 02 23:05:26 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 02 23:07:45 compute-1 sshd-session[29935]: Received disconnect from 193.46.255.244 port 56458:11:  [preauth]
Dec 02 23:07:45 compute-1 sshd-session[29935]: Disconnected from authenticating user root 193.46.255.244 port 56458 [preauth]
Dec 02 23:08:18 compute-1 systemd[1]: Starting Cleanup of Temporary Directories...
Dec 02 23:08:18 compute-1 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 02 23:08:18 compute-1 systemd[1]: Finished Cleanup of Temporary Directories.
Dec 02 23:08:18 compute-1 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 02 23:08:38 compute-1 sshd-session[29940]: Accepted publickey for zuul from 38.102.83.66 port 39388 ssh2: RSA SHA256:hdlXDg7PlzRXiLISnY+IUpp6Y3Jc5y9DXpVHJTD4Z4A
Dec 02 23:08:38 compute-1 systemd-logind[790]: New session 7 of user zuul.
Dec 02 23:08:38 compute-1 systemd[1]: Started Session 7 of User zuul.
Dec 02 23:08:38 compute-1 sshd-session[29940]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:08:38 compute-1 python3[30016]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:08:41 compute-1 sudo[30130]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylcqtuttkdvkgookjnyzvwkxifkvoqyz ; /usr/bin/python3'
Dec 02 23:08:41 compute-1 sudo[30130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:41 compute-1 python3[30132]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:08:41 compute-1 sudo[30130]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:42 compute-1 sudo[30203]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkglzqhyggvivcsseoipnntupgbjegaw ; /usr/bin/python3'
Dec 02 23:08:42 compute-1 sudo[30203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:42 compute-1 python3[30205]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764716921.4344854-33764-208089961469957/source mode=0755 _original_basename=delorean.repo follow=False checksum=411ac78a3f8a50f4fad8cedb733e290aaaf7f3f6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:08:42 compute-1 sudo[30203]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:42 compute-1 sudo[30229]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyxnthngxbhoxvjcorgmeuzxxtluvbet ; /usr/bin/python3'
Dec 02 23:08:42 compute-1 sudo[30229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:42 compute-1 python3[30231]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-master-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:08:42 compute-1 sudo[30229]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:42 compute-1 sudo[30302]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhykajvxltqvjdtpmffvydgvqjgrqnbs ; /usr/bin/python3'
Dec 02 23:08:42 compute-1 sudo[30302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:42 compute-1 python3[30304]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764716921.4344854-33764-208089961469957/source mode=0755 _original_basename=delorean-master-testing.repo follow=False checksum=c22157e85d05af7ffbafa054f80958446d397a41 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:08:42 compute-1 sudo[30302]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:43 compute-1 sudo[30328]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxtkqbhgyqahwyotsnycyspiyruymhfy ; /usr/bin/python3'
Dec 02 23:08:43 compute-1 sudo[30328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:43 compute-1 python3[30330]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:08:43 compute-1 sudo[30328]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:43 compute-1 sudo[30401]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqaariuyqtpmwgvocsnmbpkrklxxqicf ; /usr/bin/python3'
Dec 02 23:08:43 compute-1 sudo[30401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:43 compute-1 python3[30403]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764716921.4344854-33764-208089961469957/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:08:43 compute-1 sudo[30401]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:43 compute-1 sudo[30427]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsbxhzrvdfzamjlulhvwjzsbbzpkxvnm ; /usr/bin/python3'
Dec 02 23:08:43 compute-1 sudo[30427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:43 compute-1 python3[30429]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:08:43 compute-1 sudo[30427]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:44 compute-1 sudo[30500]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flwnayurhnworosqeoavjoadgmxpotug ; /usr/bin/python3'
Dec 02 23:08:44 compute-1 sudo[30500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:44 compute-1 python3[30502]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764716921.4344854-33764-208089961469957/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:08:44 compute-1 sudo[30500]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:44 compute-1 sudo[30526]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wftmqjayhliuwkeqkwvthzzpoczgkncf ; /usr/bin/python3'
Dec 02 23:08:44 compute-1 sudo[30526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:44 compute-1 python3[30528]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:08:44 compute-1 sudo[30526]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:44 compute-1 sudo[30599]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqknexhwratimzuxmrayorzuuxandlmr ; /usr/bin/python3'
Dec 02 23:08:44 compute-1 sudo[30599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:44 compute-1 python3[30601]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764716921.4344854-33764-208089961469957/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:08:44 compute-1 sudo[30599]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:44 compute-1 sudo[30625]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yusffcprrfzohzpugbkveznclxeckyiw ; /usr/bin/python3'
Dec 02 23:08:44 compute-1 sudo[30625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:45 compute-1 python3[30627]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:08:45 compute-1 sudo[30625]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:45 compute-1 sudo[30698]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfiwzyqwotwevvjurovodiopiujeufob ; /usr/bin/python3'
Dec 02 23:08:45 compute-1 sudo[30698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:45 compute-1 python3[30700]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764716921.4344854-33764-208089961469957/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:08:45 compute-1 sudo[30698]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:45 compute-1 sudo[30724]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrkjkdesgcxazvloahzrorhqasgszyma ; /usr/bin/python3'
Dec 02 23:08:45 compute-1 sudo[30724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:45 compute-1 python3[30726]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:08:45 compute-1 sudo[30724]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:46 compute-1 sudo[30797]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgmrnoanuqjjrncizlfcgkvzlvyvoiai ; /usr/bin/python3'
Dec 02 23:08:46 compute-1 sudo[30797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:46 compute-1 python3[30799]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764716921.4344854-33764-208089961469957/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=fa2c662325f345c065cf09a4d87ff5b21ab5eb35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:08:46 compute-1 sudo[30797]: pam_unix(sudo:session): session closed for user root
Dec 02 23:09:44 compute-1 python3[30847]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:13:25 compute-1 sshd-session[30853]: error: kex_exchange_identification: read: Connection reset by peer
Dec 02 23:13:25 compute-1 sshd-session[30853]: Connection reset by 45.140.17.97 port 32754
Dec 02 23:14:22 compute-1 sshd-session[30854]: Received disconnect from 193.46.255.33 port 54954:11:  [preauth]
Dec 02 23:14:22 compute-1 sshd-session[30854]: Disconnected from authenticating user root 193.46.255.33 port 54954 [preauth]
Dec 02 23:14:44 compute-1 sshd-session[29943]: Received disconnect from 38.102.83.66 port 39388:11: disconnected by user
Dec 02 23:14:44 compute-1 sshd-session[29943]: Disconnected from user zuul 38.102.83.66 port 39388
Dec 02 23:14:44 compute-1 sshd-session[29940]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:14:44 compute-1 systemd[1]: session-7.scope: Deactivated successfully.
Dec 02 23:14:44 compute-1 systemd[1]: session-7.scope: Consumed 5.404s CPU time.
Dec 02 23:14:44 compute-1 systemd-logind[790]: Session 7 logged out. Waiting for processes to exit.
Dec 02 23:14:44 compute-1 systemd-logind[790]: Removed session 7.
Dec 02 23:15:53 compute-1 sshd-session[30856]: Connection closed by authenticating user root 185.156.73.233 port 50586 [preauth]
Dec 02 23:21:43 compute-1 sshd-session[30860]: Accepted publickey for zuul from 192.168.122.30 port 37042 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:21:43 compute-1 systemd-logind[790]: New session 8 of user zuul.
Dec 02 23:21:43 compute-1 systemd[1]: Started Session 8 of User zuul.
Dec 02 23:21:43 compute-1 sshd-session[30860]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:21:44 compute-1 python3.9[31013]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:21:45 compute-1 sudo[31192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hscnzhryyrxydlcqgrptsovpuchzcwqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717705.3395581-46-203234824979927/AnsiballZ_command.py'
Dec 02 23:21:45 compute-1 sudo[31192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:21:45 compute-1 python3.9[31194]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:21:53 compute-1 sudo[31192]: pam_unix(sudo:session): session closed for user root
Dec 02 23:21:53 compute-1 sshd-session[30863]: Connection closed by 192.168.122.30 port 37042
Dec 02 23:21:53 compute-1 sshd-session[30860]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:21:53 compute-1 systemd-logind[790]: Session 8 logged out. Waiting for processes to exit.
Dec 02 23:21:53 compute-1 systemd[1]: session-8.scope: Deactivated successfully.
Dec 02 23:21:53 compute-1 systemd[1]: session-8.scope: Consumed 8.210s CPU time.
Dec 02 23:21:53 compute-1 systemd-logind[790]: Removed session 8.
Dec 02 23:21:58 compute-1 sshd-session[31251]: Accepted publickey for zuul from 192.168.122.30 port 34262 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:21:58 compute-1 systemd-logind[790]: New session 9 of user zuul.
Dec 02 23:21:58 compute-1 systemd[1]: Started Session 9 of User zuul.
Dec 02 23:21:58 compute-1 sshd-session[31251]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:22:00 compute-1 python3.9[31404]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:22:00 compute-1 sshd-session[31254]: Connection closed by 192.168.122.30 port 34262
Dec 02 23:22:00 compute-1 sshd-session[31251]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:22:00 compute-1 systemd[1]: session-9.scope: Deactivated successfully.
Dec 02 23:22:00 compute-1 systemd-logind[790]: Session 9 logged out. Waiting for processes to exit.
Dec 02 23:22:00 compute-1 systemd-logind[790]: Removed session 9.
Dec 02 23:22:16 compute-1 sshd-session[31432]: Accepted publickey for zuul from 192.168.122.30 port 58328 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:22:16 compute-1 systemd-logind[790]: New session 10 of user zuul.
Dec 02 23:22:16 compute-1 systemd[1]: Started Session 10 of User zuul.
Dec 02 23:22:16 compute-1 sshd-session[31432]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:22:17 compute-1 python3.9[31585]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 02 23:22:18 compute-1 python3.9[31759]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:22:19 compute-1 sudo[31909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrknlbagiaismvlogeavgkonaenyxnvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717738.5605543-70-118298259267418/AnsiballZ_command.py'
Dec 02 23:22:19 compute-1 sudo[31909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:19 compute-1 python3.9[31911]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:22:19 compute-1 sudo[31909]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:20 compute-1 sudo[32062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuppxswuspwmzjntxdnkwwbhzxdurkhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717739.6693628-94-225719889874327/AnsiballZ_stat.py'
Dec 02 23:22:20 compute-1 sudo[32062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:20 compute-1 python3.9[32064]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:22:20 compute-1 sudo[32062]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:21 compute-1 sudo[32214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhmrfrfmvwmrtcarszzpfkxufiucdmca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717740.5518064-110-114605172424929/AnsiballZ_file.py'
Dec 02 23:22:21 compute-1 sudo[32214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:21 compute-1 python3.9[32216]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:22:21 compute-1 sudo[32214]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:21 compute-1 sudo[32366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoixwnswojdnnssptmzrynebaumxhjtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717741.4483092-126-223486573130605/AnsiballZ_stat.py'
Dec 02 23:22:21 compute-1 sudo[32366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:21 compute-1 python3.9[32368]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:22:22 compute-1 sudo[32366]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:22 compute-1 sudo[32489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxbxonvwfvwvnmxvyaykvrstxqrogrnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717741.4483092-126-223486573130605/AnsiballZ_copy.py'
Dec 02 23:22:22 compute-1 sudo[32489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:22 compute-1 python3.9[32491]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764717741.4483092-126-223486573130605/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:22:22 compute-1 sudo[32489]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:23 compute-1 sudo[32641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-timgnxubbcgxbwbwazxtfadidlqecjgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717742.8995998-156-94738914418155/AnsiballZ_setup.py'
Dec 02 23:22:23 compute-1 sudo[32641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:23 compute-1 python3.9[32643]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:22:23 compute-1 sudo[32641]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:24 compute-1 sudo[32797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yffbvzgnypguskueuyhqhxzfdshcabry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717743.939908-172-216799115107114/AnsiballZ_file.py'
Dec 02 23:22:24 compute-1 sudo[32797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:24 compute-1 python3.9[32799]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:22:24 compute-1 sudo[32797]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:25 compute-1 sudo[32949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlxatizehexbiwjnhxlvffyzcrcmbflw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717744.8055484-190-103544191289773/AnsiballZ_file.py'
Dec 02 23:22:25 compute-1 sudo[32949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:25 compute-1 python3.9[32951]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:22:25 compute-1 sudo[32949]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:26 compute-1 python3.9[33101]: ansible-ansible.builtin.service_facts Invoked
Dec 02 23:22:29 compute-1 sshd-session[33180]: Received disconnect from 80.94.93.233 port 64972:11:  [preauth]
Dec 02 23:22:29 compute-1 sshd-session[33180]: Disconnected from authenticating user root 80.94.93.233 port 64972 [preauth]
Dec 02 23:22:30 compute-1 python3.9[33356]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:22:32 compute-1 python3.9[33506]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:22:33 compute-1 python3.9[33660]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:22:34 compute-1 sudo[33817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abymsfvabgckaeoswlvoxgeuqjbsdeqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717754.0633528-286-203011558947564/AnsiballZ_setup.py'
Dec 02 23:22:34 compute-1 sudo[33817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:34 compute-1 python3.9[33819]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:22:34 compute-1 sudo[33817]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:35 compute-1 sudo[33901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxejfnmyaqroculxfyidnlspjmxbuxxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717754.0633528-286-203011558947564/AnsiballZ_dnf.py'
Dec 02 23:22:35 compute-1 sudo[33901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:35 compute-1 python3.9[33903]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:23:01 compute-1 anacron[7493]: Job `cron.daily' started
Dec 02 23:23:01 compute-1 anacron[7493]: Job `cron.daily' terminated
Dec 02 23:23:18 compute-1 systemd[1]: Reloading.
Dec 02 23:23:18 compute-1 systemd-rc-local-generator[34099]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:23:18 compute-1 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 02 23:23:19 compute-1 systemd[1]: Reloading.
Dec 02 23:23:19 compute-1 systemd-rc-local-generator[34141]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:23:19 compute-1 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 02 23:23:19 compute-1 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 02 23:23:19 compute-1 systemd[1]: Reloading.
Dec 02 23:23:19 compute-1 systemd-rc-local-generator[34178]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:23:19 compute-1 systemd[1]: Starting dnf makecache...
Dec 02 23:23:19 compute-1 systemd[1]: Listening on LVM2 poll daemon socket.
Dec 02 23:23:19 compute-1 dnf[34190]: Failed determining last makecache time.
Dec 02 23:23:19 compute-1 dbus-broker-launch[769]: Noticed file-system modification, trigger reload.
Dec 02 23:23:19 compute-1 dbus-broker-launch[769]: Noticed file-system modification, trigger reload.
Dec 02 23:23:19 compute-1 dbus-broker-launch[769]: Noticed file-system modification, trigger reload.
Dec 02 23:23:19 compute-1 dnf[34190]: delorean-openstack-barbican-42b4c41831408a8e323 164 kB/s | 3.0 kB     00:00
Dec 02 23:23:19 compute-1 dnf[34190]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 194 kB/s | 3.0 kB     00:00
Dec 02 23:23:19 compute-1 dnf[34190]: delorean-openstack-cinder-1c00d6490d88e436f26ef 194 kB/s | 3.0 kB     00:00
Dec 02 23:23:19 compute-1 dnf[34190]: delorean-python-stevedore-c4acc5639fd2329372142 170 kB/s | 3.0 kB     00:00
Dec 02 23:23:19 compute-1 dnf[34190]: delorean-python-cloudkitty-tests-tempest-2c80f8 151 kB/s | 3.0 kB     00:00
Dec 02 23:23:19 compute-1 dnf[34190]: delorean-os-net-config-d0cedbdb788d43e5c7551df5 194 kB/s | 3.0 kB     00:00
Dec 02 23:23:19 compute-1 dnf[34190]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 194 kB/s | 3.0 kB     00:00
Dec 02 23:23:19 compute-1 dnf[34190]: delorean-python-designate-tests-tempest-347fdbc 198 kB/s | 3.0 kB     00:00
Dec 02 23:23:19 compute-1 dnf[34190]: delorean-openstack-glance-1fd12c29b339f30fe823e 180 kB/s | 3.0 kB     00:00
Dec 02 23:23:19 compute-1 dnf[34190]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 190 kB/s | 3.0 kB     00:00
Dec 02 23:23:20 compute-1 dnf[34190]: delorean-openstack-manila-3c01b7181572c95dac462 194 kB/s | 3.0 kB     00:00
Dec 02 23:23:20 compute-1 dnf[34190]: delorean-python-whitebox-neutron-tests-tempest- 191 kB/s | 3.0 kB     00:00
Dec 02 23:23:20 compute-1 dnf[34190]: delorean-openstack-octavia-ba397f07a7331190208c 177 kB/s | 3.0 kB     00:00
Dec 02 23:23:20 compute-1 dnf[34190]: delorean-openstack-watcher-c014f81a8647287f6dcc 192 kB/s | 3.0 kB     00:00
Dec 02 23:23:20 compute-1 dnf[34190]: delorean-ansible-config_template-5ccaa22121a7ff 189 kB/s | 3.0 kB     00:00
Dec 02 23:23:20 compute-1 dnf[34190]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 191 kB/s | 3.0 kB     00:00
Dec 02 23:23:20 compute-1 dnf[34190]: delorean-openstack-swift-dc98a8463506ac520c469a 185 kB/s | 3.0 kB     00:00
Dec 02 23:23:20 compute-1 dnf[34190]: delorean-python-tempestconf-8515371b7cceebd4282 197 kB/s | 3.0 kB     00:00
Dec 02 23:23:20 compute-1 dnf[34190]: delorean-openstack-heat-ui-013accbfd179753bc3f0 201 kB/s | 3.0 kB     00:00
Dec 02 23:23:20 compute-1 dnf[34190]: CentOS Stream 9 - BaseOS                         53 kB/s | 5.9 kB     00:00
Dec 02 23:23:20 compute-1 dnf[34190]: CentOS Stream 9 - AppStream                      60 kB/s | 6.0 kB     00:00
Dec 02 23:23:20 compute-1 dnf[34190]: CentOS Stream 9 - CRB                            25 kB/s | 5.8 kB     00:00
Dec 02 23:23:20 compute-1 dnf[34190]: CentOS Stream 9 - Extras packages                64 kB/s | 8.3 kB     00:00
Dec 02 23:23:20 compute-1 dnf[34190]: dlrn-antelope-testing                            87 kB/s | 3.0 kB     00:00
Dec 02 23:23:21 compute-1 dnf[34190]: dlrn-antelope-build-deps                         91 kB/s | 3.0 kB     00:00
Dec 02 23:23:21 compute-1 dnf[34190]: centos9-rabbitmq                                103 kB/s | 3.0 kB     00:00
Dec 02 23:23:21 compute-1 dnf[34190]: centos9-storage                                 122 kB/s | 3.0 kB     00:00
Dec 02 23:23:21 compute-1 dnf[34190]: centos9-opstools                                119 kB/s | 3.0 kB     00:00
Dec 02 23:23:21 compute-1 dnf[34190]: NFV SIG OpenvSwitch                             109 kB/s | 3.0 kB     00:00
Dec 02 23:23:21 compute-1 dnf[34190]: repo-setup-centos-appstream                     167 kB/s | 4.4 kB     00:00
Dec 02 23:23:21 compute-1 dnf[34190]: repo-setup-centos-baseos                        170 kB/s | 3.9 kB     00:00
Dec 02 23:23:21 compute-1 dnf[34190]: repo-setup-centos-highavailability              171 kB/s | 3.9 kB     00:00
Dec 02 23:23:21 compute-1 dnf[34190]: repo-setup-centos-powertools                    195 kB/s | 4.3 kB     00:00
Dec 02 23:23:21 compute-1 dnf[34190]: Extra Packages for Enterprise Linux 9 - x86_64  286 kB/s |  33 kB     00:00
Dec 02 23:23:22 compute-1 dnf[34190]: Metadata cache created.
Dec 02 23:23:22 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 02 23:23:22 compute-1 systemd[1]: Finished dnf makecache.
Dec 02 23:23:22 compute-1 systemd[1]: dnf-makecache.service: Consumed 1.684s CPU time.
Dec 02 23:24:20 compute-1 kernel: SELinux:  Converting 2719 SID table entries...
Dec 02 23:24:20 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:24:20 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 02 23:24:20 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:24:20 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:24:20 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:24:20 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:24:20 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:24:21 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec 02 23:24:21 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 23:24:21 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 02 23:24:21 compute-1 systemd[1]: Reloading.
Dec 02 23:24:21 compute-1 systemd-rc-local-generator[34557]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:24:21 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 23:24:21 compute-1 sudo[33901]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:22 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 23:24:22 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 02 23:24:22 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.152s CPU time.
Dec 02 23:24:22 compute-1 systemd[1]: run-rcd648d6e1c954fcebae9ba0eb2d5878d.service: Deactivated successfully.
Dec 02 23:24:22 compute-1 sudo[35476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qelyrzlitvgakumxalkqlbbozarpeqpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717862.0463529-310-150085759081659/AnsiballZ_command.py'
Dec 02 23:24:22 compute-1 sudo[35476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:22 compute-1 python3.9[35478]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:24:23 compute-1 sudo[35476]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:24 compute-1 sudo[35757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdcaralgzzzitntcauoxldhhkvubzfsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717863.5472217-326-180259713368866/AnsiballZ_selinux.py'
Dec 02 23:24:24 compute-1 sudo[35757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:24 compute-1 python3.9[35759]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 02 23:24:24 compute-1 sudo[35757]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:25 compute-1 sudo[35909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foqfvbsagouzxyxwlkqwdxmwlsfelemw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717864.927683-348-58710219917487/AnsiballZ_command.py'
Dec 02 23:24:25 compute-1 sudo[35909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:25 compute-1 python3.9[35911]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 02 23:24:26 compute-1 sudo[35909]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:27 compute-1 sudo[36062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwfeohtfakpacjswkzrqesazsaamujil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717866.7254558-364-1161383169323/AnsiballZ_file.py'
Dec 02 23:24:27 compute-1 sudo[36062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:27 compute-1 python3.9[36064]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:24:27 compute-1 sudo[36062]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:29 compute-1 sudo[36214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fypqggmvnmqdrfmvtdcpovipvtxfcmpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717868.6525533-381-67990243273106/AnsiballZ_mount.py'
Dec 02 23:24:29 compute-1 sudo[36214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:29 compute-1 python3.9[36216]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 02 23:24:29 compute-1 sudo[36214]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:30 compute-1 sudo[36366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rylhtbhjhkpuonvkbtlwjqbvctosqlyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717870.3291335-437-249382962383994/AnsiballZ_file.py'
Dec 02 23:24:30 compute-1 sudo[36366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:30 compute-1 python3.9[36368]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:24:30 compute-1 sudo[36366]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:31 compute-1 sudo[36518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duowitwamqbixfwqgrmdncfwjlnxifth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717871.2268817-452-225349945916023/AnsiballZ_stat.py'
Dec 02 23:24:31 compute-1 sudo[36518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:32 compute-1 python3.9[36520]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:24:32 compute-1 sudo[36518]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:32 compute-1 sudo[36641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtnnvldvejzzhoezxncbcxfrevjqzxup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717871.2268817-452-225349945916023/AnsiballZ_copy.py'
Dec 02 23:24:32 compute-1 sudo[36641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:34 compute-1 python3.9[36643]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764717871.2268817-452-225349945916023/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f36371576aa58707fc1d2b8554f71ab3575c4735 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:24:34 compute-1 sudo[36641]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:38 compute-1 sudo[36793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdrsssnjjctedwfarauhumptncopwbem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717878.384205-500-133560159627587/AnsiballZ_stat.py'
Dec 02 23:24:38 compute-1 sudo[36793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:40 compute-1 python3.9[36795]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:24:40 compute-1 sudo[36793]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:40 compute-1 sudo[36945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-strwtzioqpicrdzxffvlcybaufqpjrfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717880.6184824-516-274460992317257/AnsiballZ_command.py'
Dec 02 23:24:40 compute-1 sudo[36945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:41 compute-1 python3.9[36947]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:24:41 compute-1 sudo[36945]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:41 compute-1 sudo[37098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osvwwxngfsudhxqjcbvjpnehxlyaflqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717881.4441743-532-271624500675816/AnsiballZ_file.py'
Dec 02 23:24:41 compute-1 sudo[37098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:41 compute-1 python3.9[37100]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:24:42 compute-1 sudo[37098]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:42 compute-1 sudo[37250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adbkkcyrozzlbkduzaebfzczjqurioti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717882.434572-554-40409533188374/AnsiballZ_getent.py'
Dec 02 23:24:42 compute-1 sudo[37250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:43 compute-1 python3.9[37252]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 02 23:24:43 compute-1 sudo[37250]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:43 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 23:24:43 compute-1 sudo[37404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcnjjosljnjovdrptbrrmugvvwurlegz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717883.327434-570-223919180951200/AnsiballZ_group.py'
Dec 02 23:24:43 compute-1 sudo[37404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:44 compute-1 python3.9[37406]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 02 23:24:44 compute-1 groupadd[37407]: group added to /etc/group: name=qemu, GID=107
Dec 02 23:24:44 compute-1 groupadd[37407]: group added to /etc/gshadow: name=qemu
Dec 02 23:24:44 compute-1 groupadd[37407]: new group: name=qemu, GID=107
Dec 02 23:24:44 compute-1 sudo[37404]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:44 compute-1 sudo[37562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjhjkljzzrtsyzzimmhhxsqzfbxtkyfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717884.3085215-586-235860907668429/AnsiballZ_user.py'
Dec 02 23:24:44 compute-1 sudo[37562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:45 compute-1 python3.9[37564]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 02 23:24:45 compute-1 useradd[37566]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Dec 02 23:24:45 compute-1 sudo[37562]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:45 compute-1 sudo[37722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asaztthgetzbfcxtsdgwgpatwegszqkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717885.3841977-602-184270183915366/AnsiballZ_getent.py'
Dec 02 23:24:45 compute-1 sudo[37722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:45 compute-1 python3.9[37724]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 02 23:24:45 compute-1 sudo[37722]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:46 compute-1 sudo[37875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unvohkktfwibzszqsiuyzjnugjdwdvsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717886.2059762-618-32855971655434/AnsiballZ_group.py'
Dec 02 23:24:46 compute-1 sudo[37875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:46 compute-1 python3.9[37877]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 02 23:24:46 compute-1 groupadd[37878]: group added to /etc/group: name=hugetlbfs, GID=42477
Dec 02 23:24:46 compute-1 groupadd[37878]: group added to /etc/gshadow: name=hugetlbfs
Dec 02 23:24:46 compute-1 groupadd[37878]: new group: name=hugetlbfs, GID=42477
Dec 02 23:24:46 compute-1 sudo[37875]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:47 compute-1 sudo[38033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umajobsgdsvuwgriguwklwciiwvldghh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717887.099732-636-138704528071784/AnsiballZ_file.py'
Dec 02 23:24:47 compute-1 sudo[38033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:47 compute-1 python3.9[38035]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 02 23:24:47 compute-1 sudo[38033]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:48 compute-1 sudo[38185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssymfbdertiqgfwrhqcgrbdovdabohif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717888.1067424-658-147886767025810/AnsiballZ_dnf.py'
Dec 02 23:24:48 compute-1 sudo[38185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:48 compute-1 python3.9[38187]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:24:50 compute-1 sudo[38185]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:50 compute-1 sudo[38338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfnjdklcisutsjuxjstwenpkkrvyfmic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717890.545301-674-141221123183104/AnsiballZ_file.py'
Dec 02 23:24:50 compute-1 sudo[38338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:51 compute-1 python3.9[38340]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:24:51 compute-1 sudo[38338]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:51 compute-1 sudo[38490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lviteksixuxzjsdfgbcdxjbilkngzqde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717891.3224254-690-202980748717384/AnsiballZ_stat.py'
Dec 02 23:24:51 compute-1 sudo[38490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:51 compute-1 python3.9[38492]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:24:51 compute-1 sudo[38490]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:52 compute-1 sudo[38613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukpyuodhklazmpiwefqddjlgqzjyafmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717891.3224254-690-202980748717384/AnsiballZ_copy.py'
Dec 02 23:24:52 compute-1 sudo[38613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:52 compute-1 python3.9[38615]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764717891.3224254-690-202980748717384/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:24:52 compute-1 sudo[38613]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:53 compute-1 sudo[38765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctsxirkwdnlbwtboruaipmahtspifwwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717892.7213922-721-219468040320581/AnsiballZ_systemd.py'
Dec 02 23:24:53 compute-1 sudo[38765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:53 compute-1 python3.9[38767]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:24:53 compute-1 systemd[1]: Starting Load Kernel Modules...
Dec 02 23:24:53 compute-1 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 02 23:24:53 compute-1 kernel: Bridge firewalling registered
Dec 02 23:24:53 compute-1 systemd-modules-load[38771]: Inserted module 'br_netfilter'
Dec 02 23:24:53 compute-1 systemd[1]: Finished Load Kernel Modules.
Dec 02 23:24:53 compute-1 sudo[38765]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:54 compute-1 sudo[38924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftvxykraxdonxcjevdjujbkptaqixamj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717894.023121-736-52006608637641/AnsiballZ_stat.py'
Dec 02 23:24:54 compute-1 sudo[38924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:54 compute-1 python3.9[38926]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:24:54 compute-1 sudo[38924]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:55 compute-1 sudo[39047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipwglqlfkhqqvxpqixjctvkvyjunucwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717894.023121-736-52006608637641/AnsiballZ_copy.py'
Dec 02 23:24:55 compute-1 sudo[39047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:55 compute-1 python3.9[39049]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764717894.023121-736-52006608637641/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:24:55 compute-1 sudo[39047]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:56 compute-1 sudo[39199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fstxuyumdrgxsquntziulndmokoocuoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717895.822939-772-148380040543381/AnsiballZ_dnf.py'
Dec 02 23:24:56 compute-1 sudo[39199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:56 compute-1 python3.9[39201]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:24:59 compute-1 dbus-broker-launch[769]: Noticed file-system modification, trigger reload.
Dec 02 23:24:59 compute-1 dbus-broker-launch[769]: Noticed file-system modification, trigger reload.
Dec 02 23:25:00 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 23:25:00 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 02 23:25:00 compute-1 systemd[1]: Reloading.
Dec 02 23:25:00 compute-1 systemd-rc-local-generator[39264]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:25:00 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 23:25:00 compute-1 sudo[39199]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:02 compute-1 python3.9[41074]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:25:02 compute-1 python3.9[41937]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 02 23:25:03 compute-1 python3.9[42819]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:25:03 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 23:25:03 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 02 23:25:03 compute-1 systemd[1]: man-db-cache-update.service: Consumed 4.660s CPU time.
Dec 02 23:25:03 compute-1 systemd[1]: run-r7c8a9045b56a4a919454ce59975da57e.service: Deactivated successfully.
Dec 02 23:25:04 compute-1 sudo[43387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jimtgbfbxwrfpayjthgitbaquwjgazim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717904.0066206-850-161455678433264/AnsiballZ_command.py'
Dec 02 23:25:04 compute-1 sudo[43387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:04 compute-1 python3.9[43389]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:25:04 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 02 23:25:04 compute-1 systemd[1]: Starting Authorization Manager...
Dec 02 23:25:05 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 02 23:25:05 compute-1 polkitd[43606]: Started polkitd version 0.117
Dec 02 23:25:05 compute-1 polkitd[43606]: Loading rules from directory /etc/polkit-1/rules.d
Dec 02 23:25:05 compute-1 polkitd[43606]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 02 23:25:05 compute-1 polkitd[43606]: Finished loading, compiling and executing 2 rules
Dec 02 23:25:05 compute-1 systemd[1]: Started Authorization Manager.
Dec 02 23:25:05 compute-1 polkitd[43606]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Dec 02 23:25:05 compute-1 sudo[43387]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:05 compute-1 sudo[43774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qetnrdimzuzqfjhkilokkklgnepefbxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717905.5470648-868-54365016922578/AnsiballZ_systemd.py'
Dec 02 23:25:05 compute-1 sudo[43774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:06 compute-1 python3.9[43776]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:25:06 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 02 23:25:06 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Dec 02 23:25:06 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 02 23:25:06 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 02 23:25:06 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 02 23:25:06 compute-1 sudo[43774]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:07 compute-1 python3.9[43938]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 02 23:25:10 compute-1 sudo[44088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iffyzphpwwzrurkujlvypmvykdrchbke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717910.248221-982-109543656111879/AnsiballZ_systemd.py'
Dec 02 23:25:10 compute-1 sudo[44088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:10 compute-1 python3.9[44090]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:25:10 compute-1 systemd[1]: Reloading.
Dec 02 23:25:10 compute-1 systemd-rc-local-generator[44114]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:25:11 compute-1 sudo[44088]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:11 compute-1 sudo[44277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrnlfooubyyyrolljswigevdmaddginz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717911.3359683-982-145843452898306/AnsiballZ_systemd.py'
Dec 02 23:25:11 compute-1 sudo[44277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:11 compute-1 python3.9[44279]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:25:12 compute-1 systemd[1]: Reloading.
Dec 02 23:25:12 compute-1 systemd-rc-local-generator[44309]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:25:12 compute-1 sudo[44277]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:13 compute-1 sudo[44466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaornnngpyxorhymoejnphfjjtjneenq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717912.682409-1015-237890411836747/AnsiballZ_command.py'
Dec 02 23:25:13 compute-1 sudo[44466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:13 compute-1 python3.9[44468]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:25:13 compute-1 sudo[44466]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:13 compute-1 sudo[44619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otisadciglspbyaskcnyiavrsqyobpty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717913.5046735-1031-262766530790554/AnsiballZ_command.py'
Dec 02 23:25:13 compute-1 sudo[44619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:14 compute-1 python3.9[44621]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:25:14 compute-1 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec 02 23:25:14 compute-1 sudo[44619]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:14 compute-1 sudo[44772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfccrwszrdidkzmtjxuqjcrrhgscsbqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717914.3349967-1046-124491315327860/AnsiballZ_command.py'
Dec 02 23:25:14 compute-1 sudo[44772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:14 compute-1 python3.9[44774]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:25:16 compute-1 sudo[44772]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:16 compute-1 sudo[44934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hichamgjcmixxywkfcvpoaoegdzmxcec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717916.6317742-1062-6528641438064/AnsiballZ_command.py'
Dec 02 23:25:16 compute-1 sudo[44934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:17 compute-1 python3.9[44936]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:25:17 compute-1 sudo[44934]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:17 compute-1 sudo[45087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adxyzranbzggwqiabqzpnfmfrgpbqsaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717917.357859-1078-166524187933529/AnsiballZ_systemd.py'
Dec 02 23:25:17 compute-1 sudo[45087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:18 compute-1 python3.9[45089]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:25:18 compute-1 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 02 23:25:18 compute-1 systemd[1]: Stopped Apply Kernel Variables.
Dec 02 23:25:18 compute-1 systemd[1]: Stopping Apply Kernel Variables...
Dec 02 23:25:18 compute-1 systemd[1]: Starting Apply Kernel Variables...
Dec 02 23:25:18 compute-1 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 02 23:25:18 compute-1 systemd[1]: Finished Apply Kernel Variables.
Dec 02 23:25:18 compute-1 sudo[45087]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:18 compute-1 sshd-session[31435]: Connection closed by 192.168.122.30 port 58328
Dec 02 23:25:18 compute-1 sshd-session[31432]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:25:18 compute-1 systemd[1]: session-10.scope: Deactivated successfully.
Dec 02 23:25:18 compute-1 systemd[1]: session-10.scope: Consumed 2min 14.739s CPU time.
Dec 02 23:25:18 compute-1 systemd-logind[790]: Session 10 logged out. Waiting for processes to exit.
Dec 02 23:25:18 compute-1 systemd-logind[790]: Removed session 10.
Dec 02 23:25:23 compute-1 sshd-session[45120]: Accepted publickey for zuul from 192.168.122.30 port 49656 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:25:23 compute-1 systemd-logind[790]: New session 11 of user zuul.
Dec 02 23:25:23 compute-1 systemd[1]: Started Session 11 of User zuul.
Dec 02 23:25:23 compute-1 sshd-session[45120]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:25:24 compute-1 python3.9[45273]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:25:25 compute-1 python3.9[45427]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:25:26 compute-1 sudo[45581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgcushbwcdkfsnuukhnjtwhpqvyhughw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717926.490097-81-246787196110223/AnsiballZ_command.py'
Dec 02 23:25:26 compute-1 sudo[45581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:27 compute-1 python3.9[45583]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:25:27 compute-1 sudo[45581]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:28 compute-1 python3.9[45734]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:25:29 compute-1 sudo[45888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdbkpaikuiuyeksrsjjroittxtqtgihy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717928.755739-121-239871899665579/AnsiballZ_setup.py'
Dec 02 23:25:29 compute-1 sudo[45888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:29 compute-1 python3.9[45890]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:25:29 compute-1 sudo[45888]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:30 compute-1 sudo[45972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihdnhxbpcddkadwiuhvjnaklquakazzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717928.755739-121-239871899665579/AnsiballZ_dnf.py'
Dec 02 23:25:30 compute-1 sudo[45972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:30 compute-1 python3.9[45974]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:25:31 compute-1 sudo[45972]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:32 compute-1 sudo[46125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htfjmsvixmuqsqinmhsrnpfxjbysyulo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717931.8460221-145-238163296016809/AnsiballZ_setup.py'
Dec 02 23:25:32 compute-1 sudo[46125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:32 compute-1 python3.9[46127]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:25:32 compute-1 sudo[46125]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:33 compute-1 sudo[46296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lisbkchzyrgbjdtkfnbyhvqgrxdhzudj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717932.9827538-167-182710757889088/AnsiballZ_file.py'
Dec 02 23:25:33 compute-1 sudo[46296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:33 compute-1 python3.9[46298]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:25:33 compute-1 sudo[46296]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:34 compute-1 sudo[46448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyrppvqfpfbneeuzpqlbqsopwowykbvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717933.9441092-183-229234451501853/AnsiballZ_command.py'
Dec 02 23:25:34 compute-1 sudo[46448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:34 compute-1 python3.9[46450]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:25:34 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat1498189767-merged.mount: Deactivated successfully.
Dec 02 23:25:34 compute-1 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck3961272877-merged.mount: Deactivated successfully.
Dec 02 23:25:34 compute-1 podman[46451]: 2025-12-02 23:25:34.617702748 +0000 UTC m=+0.075087017 system refresh
Dec 02 23:25:34 compute-1 sudo[46448]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:35 compute-1 sudo[46610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gckgkzfcjsjekizdajqoyyljhgpmfpxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717934.9369912-199-199902058366953/AnsiballZ_stat.py'
Dec 02 23:25:35 compute-1 sudo[46610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:35 compute-1 python3.9[46612]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:25:35 compute-1 sudo[46610]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:35 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:25:36 compute-1 sudo[46733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joxoilqottwemjvbicjfuqdueiztfobe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717934.9369912-199-199902058366953/AnsiballZ_copy.py'
Dec 02 23:25:36 compute-1 sudo[46733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:36 compute-1 python3.9[46735]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764717934.9369912-199-199902058366953/.source.json follow=False _original_basename=podman_network_config.j2 checksum=9b55539d0f970291cce6e9e9c5c30e6002831bdc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:25:36 compute-1 sudo[46733]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:36 compute-1 sudo[46885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgobiggpmlhymfcjvpewszinncoqrnul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717936.4657457-229-229971699001799/AnsiballZ_stat.py'
Dec 02 23:25:36 compute-1 sudo[46885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:36 compute-1 python3.9[46887]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:25:36 compute-1 sudo[46885]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:37 compute-1 sudo[47008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdltnoxilfumnaqehoghnisfsdtexsxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717936.4657457-229-229971699001799/AnsiballZ_copy.py'
Dec 02 23:25:37 compute-1 sudo[47008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:37 compute-1 python3.9[47010]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764717936.4657457-229-229971699001799/.source.conf follow=False _original_basename=registries.conf.j2 checksum=51dca2f6e7d675b0597f23a4e044edd3f4faff03 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:25:37 compute-1 sudo[47008]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:38 compute-1 sudo[47160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvgglrpccxgpyysjwhuoxjjsybhivsuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717937.8797214-261-171075273695167/AnsiballZ_ini_file.py'
Dec 02 23:25:38 compute-1 sudo[47160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:38 compute-1 python3.9[47162]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:25:38 compute-1 sudo[47160]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:39 compute-1 sudo[47312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yskbzuzfolnxapwcfgbzgwhpsclpxlda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717938.6756184-261-232462481151518/AnsiballZ_ini_file.py'
Dec 02 23:25:39 compute-1 sudo[47312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:39 compute-1 python3.9[47314]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:25:39 compute-1 sudo[47312]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:39 compute-1 sudo[47464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdzspkxurspijhncgcdsjchvbaoqkzyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717939.4165592-261-1684054002933/AnsiballZ_ini_file.py'
Dec 02 23:25:39 compute-1 sudo[47464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:40 compute-1 python3.9[47466]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:25:40 compute-1 sudo[47464]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:40 compute-1 sudo[47616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utroiirnyawdmddsrsodfxnwahecmwyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717940.197163-261-238174753627780/AnsiballZ_ini_file.py'
Dec 02 23:25:40 compute-1 sudo[47616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:40 compute-1 python3.9[47618]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:25:40 compute-1 sudo[47616]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:41 compute-1 python3.9[47768]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:25:42 compute-1 sudo[47920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkohxgjdhkjpgnbsusztwwxymbayfoso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717941.9518187-341-260431029459948/AnsiballZ_dnf.py'
Dec 02 23:25:42 compute-1 sudo[47920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:42 compute-1 python3.9[47922]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:25:43 compute-1 sudo[47920]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:44 compute-1 sudo[48073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpagduhlomuipiqquodrbgugikkkpfzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717943.9858594-357-163061443296207/AnsiballZ_dnf.py'
Dec 02 23:25:44 compute-1 sudo[48073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:44 compute-1 python3.9[48075]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:25:46 compute-1 sudo[48073]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:47 compute-1 sudo[48233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aizszefdxlanfjbyidrettcwsxinpstv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717946.6699593-377-151640186100756/AnsiballZ_dnf.py'
Dec 02 23:25:47 compute-1 sudo[48233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:47 compute-1 python3.9[48235]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:25:48 compute-1 sudo[48233]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:49 compute-1 sudo[48386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifbtbijabvplpyhenadleifhutequikv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717948.8236456-395-204789280376144/AnsiballZ_dnf.py'
Dec 02 23:25:49 compute-1 sudo[48386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:49 compute-1 python3.9[48388]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:25:50 compute-1 sudo[48386]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:51 compute-1 sudo[48539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbamymerwmwmoyqwroqkgakuvvunnclj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717951.0634708-417-162059216898244/AnsiballZ_dnf.py'
Dec 02 23:25:51 compute-1 sudo[48539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:51 compute-1 python3.9[48541]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:25:53 compute-1 sudo[48539]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:53 compute-1 sudo[48695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-korkvqnmutdzszuyqhduhnvzynxldqvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717953.3321958-433-82589985295291/AnsiballZ_dnf.py'
Dec 02 23:25:53 compute-1 sudo[48695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:53 compute-1 python3.9[48697]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:25:56 compute-1 sudo[48695]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:56 compute-1 sudo[48864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvsbjxfhhamcxvmdsxrhnxjetekhjths ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717956.3968472-451-4704660888035/AnsiballZ_dnf.py'
Dec 02 23:25:56 compute-1 sudo[48864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:56 compute-1 python3.9[48866]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:25:58 compute-1 sudo[48864]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:58 compute-1 sudo[49017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysncoxdfmikoxronhmlpwdgncbqbcczp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717958.6077673-469-118004900056920/AnsiballZ_dnf.py'
Dec 02 23:25:58 compute-1 sudo[49017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:59 compute-1 python3.9[49019]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:26:10 compute-1 sudo[49017]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:11 compute-1 sudo[49353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqlollimstfbvdxabrjanbimmrxihzlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717970.6975276-487-257572667407264/AnsiballZ_dnf.py'
Dec 02 23:26:11 compute-1 sudo[49353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:11 compute-1 python3.9[49355]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:26:12 compute-1 sudo[49353]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:13 compute-1 sudo[49509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdpujixffzihmpwlcqhkkxbegernusgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717972.9414937-509-178507724167728/AnsiballZ_file.py'
Dec 02 23:26:13 compute-1 sudo[49509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:13 compute-1 python3.9[49511]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:26:13 compute-1 sudo[49509]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:14 compute-1 sudo[49684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcrcfyxnvohhuguntzctwadhbgtbgtvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717973.857773-525-272997573665847/AnsiballZ_stat.py'
Dec 02 23:26:14 compute-1 sudo[49684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:14 compute-1 python3.9[49686]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:26:14 compute-1 sudo[49684]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:14 compute-1 sudo[49807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymxqvhvbcniwhuopkihwmppplsjaaeca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717973.857773-525-272997573665847/AnsiballZ_copy.py'
Dec 02 23:26:14 compute-1 sudo[49807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:15 compute-1 python3.9[49809]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764717973.857773-525-272997573665847/.source.json _original_basename=.ubbudugs follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:26:15 compute-1 sudo[49807]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:16 compute-1 sudo[49959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmorswrqgmrgmrlranetewzlqiddltxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717975.5321562-561-5377883457723/AnsiballZ_podman_image.py'
Dec 02 23:26:16 compute-1 sudo[49959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:16 compute-1 python3.9[49961]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 23:26:16 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:18 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat4270103455-lower\x2dmapped.mount: Deactivated successfully.
Dec 02 23:26:22 compute-1 podman[49973]: 2025-12-02 23:26:22.471067862 +0000 UTC m=+6.136414242 image pull 78889ae0cf8c3740f43b6df72a2c4568ab589fb816614851d476abc277d3fffb 38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Dec 02 23:26:22 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:22 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:22 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:22 compute-1 sudo[49959]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:23 compute-1 sudo[50270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpdrjfmfrfzennxakgwgbcstoujgnxvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717983.1493363-583-175336334892328/AnsiballZ_podman_image.py'
Dec 02 23:26:23 compute-1 sudo[50270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:23 compute-1 python3.9[50272]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 23:26:23 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:32 compute-1 podman[50284]: 2025-12-02 23:26:32.260651232 +0000 UTC m=+8.586321607 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 02 23:26:32 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:32 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:32 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:32 compute-1 sudo[50270]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:34 compute-1 sudo[50591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opfdrsntwksgvemrtvxfsrafudxvaumr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717994.3516195-603-19574339558026/AnsiballZ_podman_image.py'
Dec 02 23:26:34 compute-1 sudo[50591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:34 compute-1 python3.9[50593]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 23:26:34 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:35 compute-1 podman[50605]: 2025-12-02 23:26:35.226085199 +0000 UTC m=+0.338139264 image pull 13a8acc03c3934b75192e1b3a8c127f56bf115253a854621e8e0e8b6330d5e9b 38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Dec 02 23:26:35 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:35 compute-1 sudo[50591]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:35 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:36 compute-1 sudo[50841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqhaoibldbtiwxbgikzubkxnariwfnmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717995.9068885-621-143311422249392/AnsiballZ_podman_image.py'
Dec 02 23:26:36 compute-1 sudo[50841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:36 compute-1 python3.9[50843]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 23:26:36 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:49 compute-1 podman[50857]: 2025-12-02 23:26:49.089137007 +0000 UTC m=+12.561487059 image pull 99c98706e6d475ab9a9b50baf3431e8745aac38f98f776ef6ab7d3c7a2811699 38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Dec 02 23:26:49 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:49 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:49 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:49 compute-1 sudo[50841]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:52 compute-1 sudo[51115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgmmznkkrfiigqndnqzvraezeesnmlva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718012.1338902-643-47086166110760/AnsiballZ_podman_image.py'
Dec 02 23:26:52 compute-1 sudo[51115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:52 compute-1 python3.9[51117]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.2:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 23:26:52 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:56 compute-1 podman[51129]: 2025-12-02 23:26:56.165675061 +0000 UTC m=+3.472724249 image pull f524ba1018a442a347cd0e4973fee00e2d9be36d16bf76224f04e0d02efc067e 38.102.83.2:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest
Dec 02 23:26:56 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:56 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:56 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:56 compute-1 sudo[51115]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:56 compute-1 sudo[51384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqkuejuozxkwqplvovgueamfmjdrcboa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718016.6075685-643-13626317541892/AnsiballZ_podman_image.py'
Dec 02 23:26:56 compute-1 sudo[51384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:57 compute-1 python3.9[51386]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 23:26:57 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:58 compute-1 podman[51398]: 2025-12-02 23:26:58.452443483 +0000 UTC m=+1.243522918 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Dec 02 23:26:58 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:58 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:58 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:58 compute-1 sudo[51384]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:59 compute-1 sshd-session[45123]: Connection closed by 192.168.122.30 port 49656
Dec 02 23:26:59 compute-1 sshd-session[45120]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:26:59 compute-1 systemd[1]: session-11.scope: Deactivated successfully.
Dec 02 23:26:59 compute-1 systemd[1]: session-11.scope: Consumed 1min 49.942s CPU time.
Dec 02 23:26:59 compute-1 systemd-logind[790]: Session 11 logged out. Waiting for processes to exit.
Dec 02 23:26:59 compute-1 systemd-logind[790]: Removed session 11.
Dec 02 23:27:04 compute-1 sshd-session[51549]: Accepted publickey for zuul from 192.168.122.30 port 53500 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:27:04 compute-1 systemd-logind[790]: New session 12 of user zuul.
Dec 02 23:27:04 compute-1 systemd[1]: Started Session 12 of User zuul.
Dec 02 23:27:04 compute-1 sshd-session[51549]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:27:05 compute-1 python3.9[51702]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:27:06 compute-1 sudo[51856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwqezqsbutgdzpuudolonvkfabldbmcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718026.558261-53-277798811700803/AnsiballZ_getent.py'
Dec 02 23:27:06 compute-1 sudo[51856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:07 compute-1 python3.9[51858]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 02 23:27:07 compute-1 sudo[51856]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:07 compute-1 sudo[52009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcwttedeopsbkzrbpfwjylihtdgcsihz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718027.4418724-69-163591698084543/AnsiballZ_group.py'
Dec 02 23:27:07 compute-1 sudo[52009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:08 compute-1 python3.9[52011]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 02 23:27:08 compute-1 groupadd[52012]: group added to /etc/group: name=openvswitch, GID=42476
Dec 02 23:27:08 compute-1 groupadd[52012]: group added to /etc/gshadow: name=openvswitch
Dec 02 23:27:08 compute-1 groupadd[52012]: new group: name=openvswitch, GID=42476
Dec 02 23:27:08 compute-1 sudo[52009]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:09 compute-1 sudo[52167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwxhogqbkmbmnfreitewxoclsqgbbwgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718028.4917748-85-24220664530831/AnsiballZ_user.py'
Dec 02 23:27:09 compute-1 sudo[52167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:09 compute-1 python3.9[52169]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 02 23:27:09 compute-1 useradd[52171]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Dec 02 23:27:09 compute-1 useradd[52171]: add 'openvswitch' to group 'hugetlbfs'
Dec 02 23:27:09 compute-1 useradd[52171]: add 'openvswitch' to shadow group 'hugetlbfs'
Dec 02 23:27:09 compute-1 sudo[52167]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:10 compute-1 sudo[52327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxtxcmegksrvsuglhcxtceeuskkxitou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718029.927997-105-240461236428153/AnsiballZ_setup.py'
Dec 02 23:27:10 compute-1 sudo[52327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:10 compute-1 python3.9[52329]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:27:10 compute-1 sudo[52327]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:11 compute-1 sudo[52411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugqrbeprfzaoxogzprfhawclkwqnzjqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718029.927997-105-240461236428153/AnsiballZ_dnf.py'
Dec 02 23:27:11 compute-1 sudo[52411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:11 compute-1 python3.9[52413]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:27:12 compute-1 sudo[52411]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:13 compute-1 sudo[52573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tudaxzfpwygvwtdtbrqnmhqtyrelptyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718033.2735019-133-173119463977819/AnsiballZ_dnf.py'
Dec 02 23:27:13 compute-1 sudo[52573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:14 compute-1 python3.9[52575]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:27:19 compute-1 sshd-session[52590]: Invalid user ubnt from 185.156.73.233 port 43530
Dec 02 23:27:19 compute-1 sshd-session[52590]: Connection closed by invalid user ubnt 185.156.73.233 port 43530 [preauth]
Dec 02 23:27:26 compute-1 kernel: SELinux:  Converting 2733 SID table entries...
Dec 02 23:27:26 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:27:26 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 02 23:27:26 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:27:26 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:27:26 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:27:26 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:27:26 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:27:26 compute-1 groupadd[52600]: group added to /etc/group: name=unbound, GID=993
Dec 02 23:27:26 compute-1 groupadd[52600]: group added to /etc/gshadow: name=unbound
Dec 02 23:27:26 compute-1 groupadd[52600]: new group: name=unbound, GID=993
Dec 02 23:27:26 compute-1 useradd[52607]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Dec 02 23:27:26 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec 02 23:27:26 compute-1 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 02 23:27:27 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 23:27:27 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 02 23:27:27 compute-1 systemd[1]: Reloading.
Dec 02 23:27:27 compute-1 systemd-rc-local-generator[53104]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:27:27 compute-1 systemd-sysv-generator[53107]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:27:27 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 23:27:28 compute-1 sudo[52573]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:28 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 23:27:28 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 02 23:27:28 compute-1 systemd[1]: run-r39d7150317f1470d93ea66da013762e2.service: Deactivated successfully.
Dec 02 23:27:30 compute-1 sudo[53674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xinrrnyrbjtwqxgcifwjhhkxheifwzaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718049.433881-149-76031531318478/AnsiballZ_systemd.py'
Dec 02 23:27:30 compute-1 sudo[53674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:30 compute-1 python3.9[53676]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 23:27:30 compute-1 systemd[1]: Reloading.
Dec 02 23:27:30 compute-1 systemd-rc-local-generator[53707]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:27:30 compute-1 systemd-sysv-generator[53711]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:27:30 compute-1 systemd[1]: Starting Open vSwitch Database Unit...
Dec 02 23:27:30 compute-1 chown[53718]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 02 23:27:30 compute-1 ovs-ctl[53723]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 02 23:27:30 compute-1 ovs-ctl[53723]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec 02 23:27:30 compute-1 ovs-ctl[53723]: Starting ovsdb-server [  OK  ]
Dec 02 23:27:30 compute-1 ovs-vsctl[53772]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 02 23:27:31 compute-1 ovs-vsctl[53788]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"e895a64d-10b7-4a6e-a7ff-0745f1562623\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec 02 23:27:31 compute-1 ovs-ctl[53723]: Configuring Open vSwitch system IDs [  OK  ]
Dec 02 23:27:31 compute-1 ovs-ctl[53723]: Enabling remote OVSDB managers [  OK  ]
Dec 02 23:27:31 compute-1 ovs-vsctl[53797]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Dec 02 23:27:31 compute-1 systemd[1]: Started Open vSwitch Database Unit.
Dec 02 23:27:31 compute-1 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 02 23:27:31 compute-1 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 02 23:27:31 compute-1 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 02 23:27:31 compute-1 kernel: openvswitch: Open vSwitch switching datapath
Dec 02 23:27:31 compute-1 ovs-ctl[53843]: Inserting openvswitch module [  OK  ]
Dec 02 23:27:31 compute-1 ovs-ctl[53812]: Starting ovs-vswitchd [  OK  ]
Dec 02 23:27:31 compute-1 ovs-vsctl[53861]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Dec 02 23:27:31 compute-1 ovs-ctl[53812]: Enabling remote OVSDB managers [  OK  ]
Dec 02 23:27:31 compute-1 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 02 23:27:31 compute-1 systemd[1]: Starting Open vSwitch...
Dec 02 23:27:31 compute-1 systemd[1]: Finished Open vSwitch.
Dec 02 23:27:31 compute-1 sudo[53674]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:32 compute-1 python3.9[54013]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:27:33 compute-1 sudo[54163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sysfbbwwgmahloseatwytoedvjxilfjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718052.694114-185-104226638931962/AnsiballZ_sefcontext.py'
Dec 02 23:27:33 compute-1 sudo[54163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:33 compute-1 python3.9[54165]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 02 23:27:34 compute-1 kernel: SELinux:  Converting 2747 SID table entries...
Dec 02 23:27:34 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:27:34 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 02 23:27:34 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:27:34 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:27:34 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:27:34 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:27:34 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:27:34 compute-1 sudo[54163]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:35 compute-1 python3.9[54320]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:27:36 compute-1 sudo[54476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgzzcnffaqzxjyjfwpzlzllxeogkmjju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718056.2426293-221-215791653499107/AnsiballZ_dnf.py'
Dec 02 23:27:36 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec 02 23:27:36 compute-1 sudo[54476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:36 compute-1 python3.9[54478]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:27:38 compute-1 sudo[54476]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:38 compute-1 sudo[54629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtwwpolgbvdbkdtjjzdhnjojrscsmqpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718058.2890055-237-51154014863518/AnsiballZ_command.py'
Dec 02 23:27:38 compute-1 sudo[54629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:38 compute-1 python3.9[54631]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:27:39 compute-1 sudo[54629]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:40 compute-1 sudo[54916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymqjaxlivwcwrkrqxfttijiezbjtgjhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718060.0116227-253-124372461576836/AnsiballZ_file.py'
Dec 02 23:27:40 compute-1 sudo[54916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:40 compute-1 python3.9[54918]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 23:27:40 compute-1 sudo[54916]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:41 compute-1 python3.9[55068]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:27:42 compute-1 sudo[55220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmwmrflnbddxdfpvhzhamwzhqqucwucc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718062.0021005-285-219664223247347/AnsiballZ_dnf.py'
Dec 02 23:27:42 compute-1 sudo[55220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:42 compute-1 python3.9[55222]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:27:44 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 23:27:44 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 02 23:27:44 compute-1 systemd[1]: Reloading.
Dec 02 23:27:44 compute-1 systemd-rc-local-generator[55263]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:27:44 compute-1 systemd-sysv-generator[55267]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:27:44 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 23:27:44 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 23:27:44 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 02 23:27:44 compute-1 systemd[1]: run-r1741d6e876414e2ebe84e66462aaf7c6.service: Deactivated successfully.
Dec 02 23:27:44 compute-1 sudo[55220]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:45 compute-1 sudo[55538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwndwhntcbpusvccdqqzncvzuteuettu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718065.1615918-301-182790804543130/AnsiballZ_systemd.py'
Dec 02 23:27:45 compute-1 sudo[55538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:45 compute-1 python3.9[55540]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:27:45 compute-1 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 02 23:27:45 compute-1 systemd[1]: Stopped Network Manager Wait Online.
Dec 02 23:27:45 compute-1 systemd[1]: Stopping Network Manager Wait Online...
Dec 02 23:27:45 compute-1 systemd[1]: Stopping Network Manager...
Dec 02 23:27:45 compute-1 NetworkManager[7188]: <info>  [1764718065.8721] caught SIGTERM, shutting down normally.
Dec 02 23:27:45 compute-1 NetworkManager[7188]: <info>  [1764718065.8738] dhcp4 (eth0): canceled DHCP transaction
Dec 02 23:27:45 compute-1 NetworkManager[7188]: <info>  [1764718065.8738] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 02 23:27:45 compute-1 NetworkManager[7188]: <info>  [1764718065.8738] dhcp4 (eth0): state changed no lease
Dec 02 23:27:45 compute-1 NetworkManager[7188]: <info>  [1764718065.8740] manager: NetworkManager state is now CONNECTED_SITE
Dec 02 23:27:45 compute-1 NetworkManager[7188]: <info>  [1764718065.8799] exiting (success)
Dec 02 23:27:45 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 23:27:45 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 23:27:45 compute-1 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 02 23:27:45 compute-1 systemd[1]: Stopped Network Manager.
Dec 02 23:27:45 compute-1 systemd[1]: NetworkManager.service: Consumed 13.140s CPU time, 4.4M memory peak, read 0B from disk, written 41.0K to disk.
Dec 02 23:27:45 compute-1 systemd[1]: Starting Network Manager...
Dec 02 23:27:45 compute-1 NetworkManager[55553]: <info>  [1764718065.9577] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:f1d327b1-3d2c-4b37-9ec1-7a4c7cbc8c21)
Dec 02 23:27:45 compute-1 NetworkManager[55553]: <info>  [1764718065.9580] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 02 23:27:45 compute-1 NetworkManager[55553]: <info>  [1764718065.9651] manager[0x55fec4020090]: monitoring kernel firmware directory '/lib/firmware'.
Dec 02 23:27:45 compute-1 systemd[1]: Starting Hostname Service...
Dec 02 23:27:46 compute-1 systemd[1]: Started Hostname Service.
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0854] hostname: hostname: using hostnamed
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0854] hostname: static hostname changed from (none) to "compute-1"
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0859] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0863] manager[0x55fec4020090]: rfkill: Wi-Fi hardware radio set enabled
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0863] manager[0x55fec4020090]: rfkill: WWAN hardware radio set enabled
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0883] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0890] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0891] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0891] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0891] manager: Networking is enabled by state file
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0893] settings: Loaded settings plugin: keyfile (internal)
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0896] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0920] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0929] dhcp: init: Using DHCP client 'internal'
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0932] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0936] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0941] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0948] device (lo): Activation: starting connection 'lo' (625c3601-1ec7-443e-9214-f1cc220bd16a)
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0953] device (eth0): carrier: link connected
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0957] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0961] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0961] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0966] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0971] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0976] device (eth1): carrier: link connected
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0980] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0985] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (5b544c7d-595f-5c85-b896-4057860a4650) (indicated)
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0985] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0989] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.0994] device (eth1): Activation: starting connection 'ci-private-network' (5b544c7d-595f-5c85-b896-4057860a4650)
Dec 02 23:27:46 compute-1 systemd[1]: Started Network Manager.
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1000] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1006] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1008] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1009] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1011] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1013] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1015] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1017] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1019] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1025] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1027] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1034] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1044] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1051] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1053] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1058] device (lo): Activation: successful, device activated.
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1064] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1066] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1068] manager: NetworkManager state is now CONNECTED_LOCAL
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1071] device (eth1): Activation: successful, device activated.
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1078] dhcp4 (eth0): state changed new lease, address=38.102.83.74
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1085] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 02 23:27:46 compute-1 systemd[1]: Starting Network Manager Wait Online...
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1162] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1196] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1199] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1204] manager: NetworkManager state is now CONNECTED_SITE
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1210] device (eth0): Activation: successful, device activated.
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1217] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 02 23:27:46 compute-1 NetworkManager[55553]: <info>  [1764718066.1252] manager: startup complete
Dec 02 23:27:46 compute-1 sudo[55538]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:46 compute-1 systemd[1]: Finished Network Manager Wait Online.
Dec 02 23:27:46 compute-1 sudo[55764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgqpkchzwomvdgtrpnxfowdphqqyntqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718066.3458068-317-238016069277925/AnsiballZ_dnf.py'
Dec 02 23:27:46 compute-1 sudo[55764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:46 compute-1 python3.9[55766]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:27:51 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 23:27:51 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 02 23:27:51 compute-1 systemd[1]: Reloading.
Dec 02 23:27:51 compute-1 systemd-rc-local-generator[55821]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:27:51 compute-1 systemd-sysv-generator[55824]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:27:51 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 23:27:52 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 23:27:52 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 02 23:27:52 compute-1 systemd[1]: run-r812965f3f99b4198b472dbb78e7588cd.service: Deactivated successfully.
Dec 02 23:27:52 compute-1 sudo[55764]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:53 compute-1 sudo[56224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aezldprrjegkxvkfzfjfdgkoljpaqfsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718073.2969534-341-115519919909801/AnsiballZ_stat.py'
Dec 02 23:27:53 compute-1 sudo[56224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:53 compute-1 python3.9[56226]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:27:53 compute-1 sudo[56224]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:54 compute-1 sudo[56376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpicbhoeqijlhsavvbydcizjqsebgqnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718074.1459763-359-1568949711275/AnsiballZ_ini_file.py'
Dec 02 23:27:54 compute-1 sudo[56376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:54 compute-1 python3.9[56378]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:27:54 compute-1 sudo[56376]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:55 compute-1 sudo[56530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymuyguyuqkrkxayzoiwxmgkedrblqahq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718075.2648108-379-127814940867823/AnsiballZ_ini_file.py'
Dec 02 23:27:55 compute-1 sudo[56530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:55 compute-1 python3.9[56532]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:27:55 compute-1 sudo[56530]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:56 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 23:27:56 compute-1 sudo[56682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opxnzjdfbzxfhdfjwnfhcsveogkoektc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718075.9411426-379-199014920366750/AnsiballZ_ini_file.py'
Dec 02 23:27:56 compute-1 sudo[56682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:56 compute-1 python3.9[56684]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:27:56 compute-1 sudo[56682]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:57 compute-1 sudo[56834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmzntkvyjwzwxiwxvrmjrsrzzqefhpky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718076.7590444-409-185171440660512/AnsiballZ_ini_file.py'
Dec 02 23:27:57 compute-1 sudo[56834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:57 compute-1 python3.9[56836]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:27:57 compute-1 sudo[56834]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:57 compute-1 sudo[56986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfbfnrlnjdtarfettuddthhkjbyothff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718077.499278-409-244739320974698/AnsiballZ_ini_file.py'
Dec 02 23:27:57 compute-1 sudo[56986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:58 compute-1 python3.9[56988]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:27:58 compute-1 sudo[56986]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:58 compute-1 sudo[57138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdaxlhdrckmcvtcwlyqplfnxlechkyzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718078.2612517-439-253856939081024/AnsiballZ_stat.py'
Dec 02 23:27:58 compute-1 sudo[57138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:58 compute-1 python3.9[57140]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:27:58 compute-1 sudo[57138]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:59 compute-1 sudo[57261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tisqmolcngokjhyjsmbwmuilsuftokri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718078.2612517-439-253856939081024/AnsiballZ_copy.py'
Dec 02 23:27:59 compute-1 sudo[57261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:59 compute-1 python3.9[57263]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718078.2612517-439-253856939081024/.source _original_basename=.vo67n6kh follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:27:59 compute-1 sudo[57261]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:59 compute-1 sudo[57413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkemzzbtbwzbihugqcxxjkguqnktmfkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718079.612907-469-24906972945202/AnsiballZ_file.py'
Dec 02 23:27:59 compute-1 sudo[57413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:00 compute-1 python3.9[57415]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:28:00 compute-1 sudo[57413]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:00 compute-1 sudo[57565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywuyigskavdsyfhyvgvzuypomwgiixkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718080.3352692-485-41469447977165/AnsiballZ_edpm_os_net_config_mappings.py'
Dec 02 23:28:00 compute-1 sudo[57565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:00 compute-1 python3.9[57567]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 02 23:28:00 compute-1 sudo[57565]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:01 compute-1 sudo[57717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efhwgwzezfshaankioazgpicmebyaczp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718081.296638-503-210069778666817/AnsiballZ_file.py'
Dec 02 23:28:01 compute-1 sudo[57717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:01 compute-1 python3.9[57719]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:28:01 compute-1 sudo[57717]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:02 compute-1 sudo[57869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocrpyybigjlgifzecdxpknxqzwfwperx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718082.3072584-523-146508635749886/AnsiballZ_stat.py'
Dec 02 23:28:02 compute-1 sudo[57869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:02 compute-1 sudo[57869]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:03 compute-1 sudo[57992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ittlzkrypmcliafphremvhguwgtfpngn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718082.3072584-523-146508635749886/AnsiballZ_copy.py'
Dec 02 23:28:03 compute-1 sudo[57992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:03 compute-1 sudo[57992]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:04 compute-1 sudo[58144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejppwhfaeeltxqhdssqabzxtdwhhaauh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718083.5823305-553-190963788595281/AnsiballZ_slurp.py'
Dec 02 23:28:04 compute-1 sudo[58144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:04 compute-1 python3.9[58146]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 02 23:28:04 compute-1 sudo[58144]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:05 compute-1 sudo[58319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vagrsjidsvhrlkvvgktjpvpoavdfrqzl ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718084.5001392-571-181615950720904/async_wrapper.py j820150536824 300 /home/zuul/.ansible/tmp/ansible-tmp-1764718084.5001392-571-181615950720904/AnsiballZ_edpm_os_net_config.py _'
Dec 02 23:28:05 compute-1 sudo[58319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:05 compute-1 ansible-async_wrapper.py[58321]: Invoked with j820150536824 300 /home/zuul/.ansible/tmp/ansible-tmp-1764718084.5001392-571-181615950720904/AnsiballZ_edpm_os_net_config.py _
Dec 02 23:28:05 compute-1 ansible-async_wrapper.py[58324]: Starting module and watcher
Dec 02 23:28:05 compute-1 ansible-async_wrapper.py[58324]: Start watching 58325 (300)
Dec 02 23:28:05 compute-1 ansible-async_wrapper.py[58325]: Start module (58325)
Dec 02 23:28:05 compute-1 ansible-async_wrapper.py[58321]: Return async_wrapper task started.
Dec 02 23:28:05 compute-1 sudo[58319]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:05 compute-1 python3.9[58326]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec 02 23:28:06 compute-1 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec 02 23:28:06 compute-1 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec 02 23:28:06 compute-1 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec 02 23:28:06 compute-1 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec 02 23:28:06 compute-1 kernel: cfg80211: failed to load regulatory.db
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2302] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2321] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2782] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2785] audit: op="connection-add" uuid="cdf731e5-a425-4d4a-b8cc-614b6de6632b" name="br-ex-br" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2800] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2802] audit: op="connection-add" uuid="fe3eba47-03d3-4d94-aae3-3e59caf7848e" name="br-ex-port" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2813] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2815] audit: op="connection-add" uuid="78b6b546-e49a-4d5f-a018-fbc8888f3186" name="eth1-port" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2826] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2827] audit: op="connection-add" uuid="b59fb406-f31a-47e4-9c70-b9093baade22" name="vlan20-port" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2838] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2840] audit: op="connection-add" uuid="9ad48a68-8132-4109-bc0c-4f3a11e56868" name="vlan21-port" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2850] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2851] audit: op="connection-add" uuid="07c2ffd9-8858-4cd5-927e-02ef048d68f1" name="vlan22-port" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2867] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,connection.timestamp,connection.autoconnect-priority,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2881] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2882] audit: op="connection-add" uuid="92793f23-f2ce-4693-9e89-c0859d721e73" name="br-ex-if" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2932] audit: op="connection-update" uuid="5b544c7d-595f-5c85-b896-4057860a4650" name="ci-private-network" args="ipv4.addresses,ipv4.never-default,ipv4.routes,ipv4.dns,ipv4.method,ipv4.routing-rules,ovs-external-ids.data,connection.timestamp,connection.controller,connection.master,connection.port-type,connection.slave-type,ipv6.addr-gen-mode,ipv6.addresses,ipv6.routes,ipv6.dns,ipv6.method,ipv6.routing-rules,ovs-interface.type" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2946] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2947] audit: op="connection-add" uuid="4adbec78-5250-4f65-affb-e1edbf8192d2" name="vlan20-if" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2959] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2961] audit: op="connection-add" uuid="e71a63ff-3e01-470c-87aa-5d7052fc9dce" name="vlan21-if" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2973] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2975] audit: op="connection-add" uuid="ce882b67-1e05-43ea-9dc5-b747dddb5ccf" name="vlan22-if" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2983] audit: op="connection-delete" uuid="733da19b-fb13-3701-b718-a40535d4912d" name="Wired connection 1" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.2993] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3001] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3005] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (cdf731e5-a425-4d4a-b8cc-614b6de6632b)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3006] audit: op="connection-activate" uuid="cdf731e5-a425-4d4a-b8cc-614b6de6632b" name="br-ex-br" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3007] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3012] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3015] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (fe3eba47-03d3-4d94-aae3-3e59caf7848e)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3017] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3021] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3025] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (78b6b546-e49a-4d5f-a018-fbc8888f3186)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3026] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3031] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3034] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (b59fb406-f31a-47e4-9c70-b9093baade22)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3036] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3040] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3044] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (9ad48a68-8132-4109-bc0c-4f3a11e56868)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3046] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3050] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3053] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (07c2ffd9-8858-4cd5-927e-02ef048d68f1)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3055] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3057] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3059] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3063] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3067] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3070] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (92793f23-f2ce-4693-9e89-c0859d721e73)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3071] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3074] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3076] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3077] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3078] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3086] device (eth1): disconnecting for new activation request.
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3087] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3090] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3092] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3093] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3096] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3100] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3103] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (4adbec78-5250-4f65-affb-e1edbf8192d2)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3104] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3106] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3108] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3109] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3112] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3116] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3119] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (e71a63ff-3e01-470c-87aa-5d7052fc9dce)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3120] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3122] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3124] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3126] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3128] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3132] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3135] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (ce882b67-1e05-43ea-9dc5-b747dddb5ccf)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3136] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3139] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3141] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3142] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3144] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3154] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,connection.autoconnect-priority,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.method" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3156] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3158] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3161] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3167] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3170] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3173] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3176] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3178] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3182] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3186] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3189] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3191] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3195] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3199] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3202] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3204] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3208] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3212] dhcp4 (eth0): canceled DHCP transaction
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3213] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3213] dhcp4 (eth0): state changed no lease
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3215] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3223] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58327 uid=0 result="fail" reason="Device is not activated"
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3270] dhcp4 (eth0): state changed new lease, address=38.102.83.74
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3486] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec 02 23:28:07 compute-1 kernel: ovs-system: entered promiscuous mode
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3505] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3513] device (eth1): disconnecting for new activation request.
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3514] audit: op="connection-activate" uuid="5b544c7d-595f-5c85-b896-4057860a4650" name="ci-private-network" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 kernel: Timeout policy base is empty
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3525] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3532] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec 02 23:28:07 compute-1 systemd-udevd[58331]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:28:07 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3579] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58327 uid=0 result="success"
Dec 02 23:28:07 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3711] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3813] device (eth1): Activation: starting connection 'ci-private-network' (5b544c7d-595f-5c85-b896-4057860a4650)
Dec 02 23:28:07 compute-1 kernel: br-ex: entered promiscuous mode
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3817] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3826] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3829] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3835] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3840] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3847] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3848] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3850] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3851] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3852] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3858] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3864] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3867] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3870] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3873] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3878] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3882] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3886] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3890] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3893] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3897] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3902] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3906] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3951] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3956] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3958] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3963] device (eth1): Activation: successful, device activated.
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.3982] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 kernel: vlan22: entered promiscuous mode
Dec 02 23:28:07 compute-1 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4036] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4039] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4045] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 02 23:28:07 compute-1 kernel: vlan21: entered promiscuous mode
Dec 02 23:28:07 compute-1 systemd-udevd[58330]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4111] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4120] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 kernel: vlan20: entered promiscuous mode
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4150] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4151] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4159] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4195] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4209] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4223] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4232] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4243] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4245] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4251] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4260] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4261] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-1 NetworkManager[55553]: <info>  [1764718087.4267] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 02 23:28:08 compute-1 NetworkManager[55553]: <info>  [1764718088.5368] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58327 uid=0 result="success"
Dec 02 23:28:08 compute-1 NetworkManager[55553]: <info>  [1764718088.6978] checkpoint[0x55fec3ff6950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec 02 23:28:08 compute-1 NetworkManager[55553]: <info>  [1764718088.6982] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58327 uid=0 result="success"
Dec 02 23:28:08 compute-1 sudo[58659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnimhtulwiknkemdclrhvyydcrfgtobr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718088.4916894-571-270989575767862/AnsiballZ_async_status.py'
Dec 02 23:28:08 compute-1 sudo[58659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:08 compute-1 NetworkManager[55553]: <info>  [1764718088.9789] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58327 uid=0 result="success"
Dec 02 23:28:08 compute-1 NetworkManager[55553]: <info>  [1764718088.9801] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58327 uid=0 result="success"
Dec 02 23:28:09 compute-1 python3.9[58661]: ansible-ansible.legacy.async_status Invoked with jid=j820150536824.58321 mode=status _async_dir=/root/.ansible_async
Dec 02 23:28:09 compute-1 sudo[58659]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:09 compute-1 NetworkManager[55553]: <info>  [1764718089.1713] audit: op="networking-control" arg="global-dns-configuration" pid=58327 uid=0 result="success"
Dec 02 23:28:09 compute-1 NetworkManager[55553]: <info>  [1764718089.1757] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec 02 23:28:09 compute-1 NetworkManager[55553]: <info>  [1764718089.1790] audit: op="networking-control" arg="global-dns-configuration" pid=58327 uid=0 result="success"
Dec 02 23:28:09 compute-1 NetworkManager[55553]: <info>  [1764718089.1814] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58327 uid=0 result="success"
Dec 02 23:28:09 compute-1 NetworkManager[55553]: <info>  [1764718089.3354] checkpoint[0x55fec3ff6a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec 02 23:28:09 compute-1 NetworkManager[55553]: <info>  [1764718089.3360] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58327 uid=0 result="success"
Dec 02 23:28:09 compute-1 ansible-async_wrapper.py[58325]: Module complete (58325)
Dec 02 23:28:10 compute-1 ansible-async_wrapper.py[58324]: Done in kid B.
Dec 02 23:28:12 compute-1 sudo[58763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aujunwgimaqfaihyptdwaqtgpkioytyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718088.4916894-571-270989575767862/AnsiballZ_async_status.py'
Dec 02 23:28:12 compute-1 sudo[58763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:12 compute-1 python3.9[58765]: ansible-ansible.legacy.async_status Invoked with jid=j820150536824.58321 mode=status _async_dir=/root/.ansible_async
Dec 02 23:28:12 compute-1 sudo[58763]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:13 compute-1 sudo[58863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijhkvvdgwqbhaubnaxgitivcsktugjgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718088.4916894-571-270989575767862/AnsiballZ_async_status.py'
Dec 02 23:28:13 compute-1 sudo[58863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:13 compute-1 python3.9[58865]: ansible-ansible.legacy.async_status Invoked with jid=j820150536824.58321 mode=cleanup _async_dir=/root/.ansible_async
Dec 02 23:28:13 compute-1 sudo[58863]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:13 compute-1 sudo[59015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmaytodlnwrqzzzfrdwazudqkchiilof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718093.4416132-620-212468099992445/AnsiballZ_stat.py'
Dec 02 23:28:13 compute-1 sudo[59015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:13 compute-1 python3.9[59017]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:28:13 compute-1 sudo[59015]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:14 compute-1 sudo[59138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nirqqclncywqhsupnftdyiekrkstlrwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718093.4416132-620-212468099992445/AnsiballZ_copy.py'
Dec 02 23:28:14 compute-1 sudo[59138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:14 compute-1 python3.9[59140]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718093.4416132-620-212468099992445/.source.returncode _original_basename=.bte_zz8g follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:28:14 compute-1 sudo[59138]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:15 compute-1 sudo[59290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stsjpwhsapsykpqnqgsyadpoeyocumjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718094.699998-652-108218398248490/AnsiballZ_stat.py'
Dec 02 23:28:15 compute-1 sudo[59290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:15 compute-1 python3.9[59292]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:28:15 compute-1 sudo[59290]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:15 compute-1 sudo[59413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lolrebfoqhpsqucvhbqeykfpyrskltza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718094.699998-652-108218398248490/AnsiballZ_copy.py'
Dec 02 23:28:15 compute-1 sudo[59413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:15 compute-1 python3.9[59415]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718094.699998-652-108218398248490/.source.cfg _original_basename=.dwg0c9_a follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:28:15 compute-1 sudo[59413]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:16 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 02 23:28:16 compute-1 sudo[59568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnjmsnwlnrpkxskwjwvcsiuchqwuyjxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718095.938422-682-197823417944019/AnsiballZ_systemd.py'
Dec 02 23:28:16 compute-1 sudo[59568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:16 compute-1 python3.9[59570]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:28:16 compute-1 systemd[1]: Reloading Network Manager...
Dec 02 23:28:16 compute-1 NetworkManager[55553]: <info>  [1764718096.6291] audit: op="reload" arg="0" pid=59574 uid=0 result="success"
Dec 02 23:28:16 compute-1 NetworkManager[55553]: <info>  [1764718096.6301] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec 02 23:28:16 compute-1 systemd[1]: Reloaded Network Manager.
Dec 02 23:28:16 compute-1 sudo[59568]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:17 compute-1 sshd-session[51552]: Connection closed by 192.168.122.30 port 53500
Dec 02 23:28:17 compute-1 sshd-session[51549]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:28:17 compute-1 systemd[1]: session-12.scope: Deactivated successfully.
Dec 02 23:28:17 compute-1 systemd[1]: session-12.scope: Consumed 50.320s CPU time.
Dec 02 23:28:17 compute-1 systemd-logind[790]: Session 12 logged out. Waiting for processes to exit.
Dec 02 23:28:17 compute-1 systemd-logind[790]: Removed session 12.
Dec 02 23:28:22 compute-1 sshd-session[59605]: Accepted publickey for zuul from 192.168.122.30 port 36990 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:28:22 compute-1 systemd-logind[790]: New session 13 of user zuul.
Dec 02 23:28:22 compute-1 systemd[1]: Started Session 13 of User zuul.
Dec 02 23:28:22 compute-1 sshd-session[59605]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:28:23 compute-1 python3.9[59758]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:28:24 compute-1 python3.9[59912]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:28:26 compute-1 python3.9[60102]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:28:26 compute-1 sshd-session[59608]: Connection closed by 192.168.122.30 port 36990
Dec 02 23:28:26 compute-1 sshd-session[59605]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:28:26 compute-1 systemd[1]: session-13.scope: Deactivated successfully.
Dec 02 23:28:26 compute-1 systemd[1]: session-13.scope: Consumed 2.389s CPU time.
Dec 02 23:28:26 compute-1 systemd-logind[790]: Session 13 logged out. Waiting for processes to exit.
Dec 02 23:28:26 compute-1 systemd-logind[790]: Removed session 13.
Dec 02 23:28:26 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 23:28:31 compute-1 sshd-session[60130]: Accepted publickey for zuul from 192.168.122.30 port 45634 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:28:31 compute-1 systemd-logind[790]: New session 14 of user zuul.
Dec 02 23:28:31 compute-1 systemd[1]: Started Session 14 of User zuul.
Dec 02 23:28:31 compute-1 sshd-session[60130]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:28:32 compute-1 python3.9[60284]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:28:33 compute-1 python3.9[60438]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:28:34 compute-1 sudo[60592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdrkjvptndnrjgdbfixmvlbybojfugvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718114.093447-61-229811377635908/AnsiballZ_setup.py'
Dec 02 23:28:34 compute-1 sudo[60592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:34 compute-1 python3.9[60594]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:28:34 compute-1 sudo[60592]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:35 compute-1 sudo[60677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llopcmatnvteqecypuyzpabxukphioaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718114.093447-61-229811377635908/AnsiballZ_dnf.py'
Dec 02 23:28:35 compute-1 sudo[60677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:35 compute-1 python3.9[60679]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:28:36 compute-1 sudo[60677]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:37 compute-1 sudo[60830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anhlzxouvvzqdurljnsfgirfyvnkpnfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718116.9444954-85-10976519519452/AnsiballZ_setup.py'
Dec 02 23:28:37 compute-1 sudo[60830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:37 compute-1 python3.9[60832]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:28:37 compute-1 sudo[60830]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:38 compute-1 sudo[61022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edqmbxaumrabaogrwazmmpspgzxusxry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718118.2513843-107-194307407923514/AnsiballZ_file.py'
Dec 02 23:28:38 compute-1 sudo[61022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:39 compute-1 python3.9[61024]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:28:39 compute-1 sudo[61022]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:39 compute-1 sudo[61174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyryuzyyyxhtyvxcceilxetjjlnszqub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718119.3387444-123-58351480835513/AnsiballZ_command.py'
Dec 02 23:28:39 compute-1 sudo[61174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:40 compute-1 python3.9[61176]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:28:40 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:28:40 compute-1 sudo[61174]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:40 compute-1 sudo[61337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdovmqulkfgbaabbhdfbfcwmkadpbkoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718120.3810523-139-193408889166770/AnsiballZ_stat.py'
Dec 02 23:28:40 compute-1 sudo[61337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:41 compute-1 python3.9[61339]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:28:41 compute-1 sudo[61337]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:41 compute-1 sudo[61415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gukezfkdbgmdzqkkuuuoxrnkwqgqlhiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718120.3810523-139-193408889166770/AnsiballZ_file.py'
Dec 02 23:28:41 compute-1 sudo[61415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:41 compute-1 python3.9[61417]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:28:41 compute-1 sudo[61415]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:42 compute-1 sudo[61567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzzjcrfctlazizfgcnwdaiysjtzhiwcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718121.8606193-163-266552341324263/AnsiballZ_stat.py'
Dec 02 23:28:42 compute-1 sudo[61567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:42 compute-1 python3.9[61569]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:28:42 compute-1 sudo[61567]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:42 compute-1 sudo[61645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbozcxvcyuxptzerqsqgknhtmjrbavuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718121.8606193-163-266552341324263/AnsiballZ_file.py'
Dec 02 23:28:42 compute-1 sudo[61645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:42 compute-1 python3.9[61647]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:28:43 compute-1 sudo[61645]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:43 compute-1 sudo[61797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckvqlkiobwrdrtfigivnafuxhmtsjwgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718123.3243246-189-151277097934238/AnsiballZ_ini_file.py'
Dec 02 23:28:43 compute-1 sudo[61797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:44 compute-1 python3.9[61799]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:28:44 compute-1 sudo[61797]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:44 compute-1 sudo[61949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlaubrytbnvjzcuakwncstlphvmsnfem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718124.3245933-189-12614573797600/AnsiballZ_ini_file.py'
Dec 02 23:28:44 compute-1 sudo[61949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:44 compute-1 python3.9[61951]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:28:44 compute-1 sudo[61949]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:45 compute-1 sudo[62101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nznyyzyjuaaywvnnbcwrwsuewidmfkgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718124.9778926-189-73086477590571/AnsiballZ_ini_file.py'
Dec 02 23:28:45 compute-1 sudo[62101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:45 compute-1 python3.9[62103]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:28:45 compute-1 sudo[62101]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:45 compute-1 sudo[62253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyxazinafqirohvnszyqkysvdfxpjene ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718125.6489532-189-18785177776663/AnsiballZ_ini_file.py'
Dec 02 23:28:45 compute-1 sudo[62253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:46 compute-1 python3.9[62255]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:28:46 compute-1 sudo[62253]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:46 compute-1 sudo[62405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-futnprgqyzthdqmybxempvhblgnddxdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718126.4312718-251-214399065621802/AnsiballZ_dnf.py'
Dec 02 23:28:46 compute-1 sudo[62405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:46 compute-1 python3.9[62407]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:28:48 compute-1 sudo[62405]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:48 compute-1 sudo[62558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvpixxfssvkstkksjyvbihsqjjudfevb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718128.5945456-273-56787466832683/AnsiballZ_setup.py'
Dec 02 23:28:48 compute-1 sudo[62558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:49 compute-1 python3.9[62560]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:28:49 compute-1 sudo[62558]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:50 compute-1 sudo[62712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkbpskbespntksjygwokdjynvlygqidl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718130.2187092-289-136201516888722/AnsiballZ_stat.py'
Dec 02 23:28:50 compute-1 sudo[62712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:50 compute-1 python3.9[62714]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:28:50 compute-1 sudo[62712]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:51 compute-1 sudo[62864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwweqvgzhgfuxnrihzxfwagrcsdvrsyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718131.025502-307-241445241893310/AnsiballZ_stat.py'
Dec 02 23:28:51 compute-1 sudo[62864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:51 compute-1 python3.9[62866]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:28:51 compute-1 sudo[62864]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:52 compute-1 sudo[63016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpsljcjlhihesucsmuiczyyhgzlzbeyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718131.968394-327-55579893639409/AnsiballZ_command.py'
Dec 02 23:28:52 compute-1 sudo[63016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:52 compute-1 python3.9[63018]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:28:52 compute-1 sudo[63016]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:53 compute-1 sudo[63169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szqfzphfgacctsrzzjyhwirlacteiqaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718132.8231254-347-32840162838461/AnsiballZ_service_facts.py'
Dec 02 23:28:53 compute-1 sudo[63169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:53 compute-1 python3.9[63171]: ansible-service_facts Invoked
Dec 02 23:28:53 compute-1 network[63188]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 23:28:53 compute-1 network[63189]: 'network-scripts' will be removed from distribution in near future.
Dec 02 23:28:53 compute-1 network[63190]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 23:28:56 compute-1 sudo[63169]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:57 compute-1 sudo[63473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqbuoeswtfzgjaefullmtewemclehchr ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764718137.2438428-377-52085012504231/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764718137.2438428-377-52085012504231/args'
Dec 02 23:28:57 compute-1 sudo[63473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:57 compute-1 sudo[63473]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:58 compute-1 sudo[63640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffsciqxncuhgjmqikjmhbqltmnbpnhpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718137.9740195-399-210987968101438/AnsiballZ_dnf.py'
Dec 02 23:28:58 compute-1 sudo[63640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:58 compute-1 python3.9[63642]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:28:59 compute-1 sudo[63640]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:00 compute-1 sudo[63793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scvpzgugpzuftgttxehwnbjkemogzmdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718140.156602-425-278918146921117/AnsiballZ_package_facts.py'
Dec 02 23:29:00 compute-1 sudo[63793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:01 compute-1 python3.9[63795]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 02 23:29:01 compute-1 sudo[63793]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:02 compute-1 sudo[63945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-corsdcdmjxfmecfkgjbmerxwxnszmxxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718141.8805687-446-172262261461796/AnsiballZ_stat.py'
Dec 02 23:29:02 compute-1 sudo[63945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:02 compute-1 python3.9[63947]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:02 compute-1 sudo[63945]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:03 compute-1 sudo[64070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrsdfsghbdllqdvvscjtakembccfhyst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718141.8805687-446-172262261461796/AnsiballZ_copy.py'
Dec 02 23:29:03 compute-1 sudo[64070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:03 compute-1 python3.9[64072]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718141.8805687-446-172262261461796/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:03 compute-1 sudo[64070]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:03 compute-1 sudo[64224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnardhylbagvgimgtcyutfzycxkrfzec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718143.5095558-475-43626587629185/AnsiballZ_stat.py'
Dec 02 23:29:03 compute-1 sudo[64224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:04 compute-1 python3.9[64226]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:04 compute-1 sudo[64224]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:04 compute-1 sudo[64349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojlfxhpntnqvjafodaawootpfailqvle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718143.5095558-475-43626587629185/AnsiballZ_copy.py'
Dec 02 23:29:04 compute-1 sudo[64349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:04 compute-1 python3.9[64351]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718143.5095558-475-43626587629185/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:04 compute-1 sudo[64349]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:06 compute-1 sudo[64503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhqtutqzdhrhorffngxdysacttdggmuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718145.4839191-517-85744288920823/AnsiballZ_lineinfile.py'
Dec 02 23:29:06 compute-1 sudo[64503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:06 compute-1 python3.9[64505]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:06 compute-1 sudo[64503]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:07 compute-1 sudo[64657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eeefkackspczrpmykpcsnqqksaowygoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718147.1102178-547-260269312909191/AnsiballZ_setup.py'
Dec 02 23:29:07 compute-1 sudo[64657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:07 compute-1 python3.9[64659]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:29:08 compute-1 sudo[64657]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:08 compute-1 sudo[64741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltwiqwquvgvhujxwurfczhovbqpojvxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718147.1102178-547-260269312909191/AnsiballZ_systemd.py'
Dec 02 23:29:08 compute-1 sudo[64741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:08 compute-1 python3.9[64743]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:29:08 compute-1 sudo[64741]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:09 compute-1 sudo[64895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmawmqxayvcaribwnxqssrbdnujvfcvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718149.5948167-580-231102036913125/AnsiballZ_setup.py'
Dec 02 23:29:09 compute-1 sudo[64895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:10 compute-1 python3.9[64897]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:29:10 compute-1 sudo[64895]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:10 compute-1 sudo[64979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfkjosdeuanfoaubhyhimhecqhltgvbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718149.5948167-580-231102036913125/AnsiballZ_systemd.py'
Dec 02 23:29:10 compute-1 sudo[64979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:11 compute-1 python3.9[64981]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:29:11 compute-1 chronyd[802]: chronyd exiting
Dec 02 23:29:11 compute-1 systemd[1]: Stopping NTP client/server...
Dec 02 23:29:11 compute-1 systemd[1]: chronyd.service: Deactivated successfully.
Dec 02 23:29:11 compute-1 systemd[1]: Stopped NTP client/server.
Dec 02 23:29:11 compute-1 systemd[1]: Starting NTP client/server...
Dec 02 23:29:11 compute-1 chronyd[64990]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 02 23:29:11 compute-1 chronyd[64990]: Frequency -23.432 +/- 0.283 ppm read from /var/lib/chrony/drift
Dec 02 23:29:11 compute-1 chronyd[64990]: Loaded seccomp filter (level 2)
Dec 02 23:29:11 compute-1 systemd[1]: Started NTP client/server.
Dec 02 23:29:11 compute-1 sudo[64979]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:11 compute-1 sshd-session[60133]: Connection closed by 192.168.122.30 port 45634
Dec 02 23:29:11 compute-1 sshd-session[60130]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:29:11 compute-1 systemd[1]: session-14.scope: Deactivated successfully.
Dec 02 23:29:11 compute-1 systemd[1]: session-14.scope: Consumed 27.024s CPU time.
Dec 02 23:29:11 compute-1 systemd-logind[790]: Session 14 logged out. Waiting for processes to exit.
Dec 02 23:29:11 compute-1 systemd-logind[790]: Removed session 14.
Dec 02 23:29:14 compute-1 sshd-session[65016]: Received disconnect from 193.46.255.217 port 20687:11:  [preauth]
Dec 02 23:29:14 compute-1 sshd-session[65016]: Disconnected from authenticating user root 193.46.255.217 port 20687 [preauth]
Dec 02 23:29:17 compute-1 sshd-session[65018]: Accepted publickey for zuul from 192.168.122.30 port 45816 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:29:17 compute-1 systemd-logind[790]: New session 15 of user zuul.
Dec 02 23:29:17 compute-1 systemd[1]: Started Session 15 of User zuul.
Dec 02 23:29:17 compute-1 sshd-session[65018]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:29:18 compute-1 python3.9[65171]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:29:19 compute-1 sudo[65325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qumwyagufbthsbpqsnzahkxhgeftcvef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718158.9003193-47-236683966881759/AnsiballZ_file.py'
Dec 02 23:29:19 compute-1 sudo[65325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:19 compute-1 python3.9[65327]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:19 compute-1 sudo[65325]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:20 compute-1 sudo[65500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qaiynbhzwdyenzerrpyftwctpccprnop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718159.801808-63-55052634135510/AnsiballZ_stat.py'
Dec 02 23:29:20 compute-1 sudo[65500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:20 compute-1 python3.9[65502]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:20 compute-1 sudo[65500]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:20 compute-1 sudo[65578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfcscbyknwlvmeitsaurmugybsjquisj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718159.801808-63-55052634135510/AnsiballZ_file.py'
Dec 02 23:29:20 compute-1 sudo[65578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:20 compute-1 python3.9[65580]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.p1x2imjb recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:20 compute-1 sudo[65578]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:21 compute-1 sudo[65730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkknfjychrknuhuwytbmdlknpgjopsgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718161.539541-103-7461837418183/AnsiballZ_stat.py'
Dec 02 23:29:21 compute-1 sudo[65730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:22 compute-1 python3.9[65732]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:22 compute-1 sudo[65730]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:22 compute-1 sudo[65853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzflrpldrvsormrjiolgqalohwyduxgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718161.539541-103-7461837418183/AnsiballZ_copy.py'
Dec 02 23:29:22 compute-1 sudo[65853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:22 compute-1 python3.9[65855]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718161.539541-103-7461837418183/.source _original_basename=.1wpphcfa follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:22 compute-1 sudo[65853]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:23 compute-1 sudo[66005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyqnvsllgixilfaszgdynbujtvxztfsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718162.947938-135-185824810824777/AnsiballZ_file.py'
Dec 02 23:29:23 compute-1 sudo[66005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:23 compute-1 python3.9[66007]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:29:23 compute-1 sudo[66005]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:24 compute-1 sudo[66157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arqujuxlwcnwxdkikruwqtndcmapghbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718163.6914923-151-241211984943926/AnsiballZ_stat.py'
Dec 02 23:29:24 compute-1 sudo[66157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:24 compute-1 python3.9[66159]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:24 compute-1 sudo[66157]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:24 compute-1 sudo[66280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxgkfkvilueixplrxykhkaercoyemnuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718163.6914923-151-241211984943926/AnsiballZ_copy.py'
Dec 02 23:29:24 compute-1 sudo[66280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:24 compute-1 python3.9[66282]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718163.6914923-151-241211984943926/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:29:24 compute-1 sudo[66280]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:25 compute-1 sudo[66432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnzztmzyydeceuehpzackxelupvujngt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718164.9696956-151-132258027954097/AnsiballZ_stat.py'
Dec 02 23:29:25 compute-1 sudo[66432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:25 compute-1 python3.9[66434]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:25 compute-1 sudo[66432]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:25 compute-1 sudo[66555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkmpzbbuunbrrmolckqgrxeflgtrfifo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718164.9696956-151-132258027954097/AnsiballZ_copy.py'
Dec 02 23:29:25 compute-1 sudo[66555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:26 compute-1 python3.9[66557]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718164.9696956-151-132258027954097/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:29:26 compute-1 sudo[66555]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:26 compute-1 sudo[66707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-delsqrdrztjobovbidvzsebdqgeoqskn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718166.372427-209-175767276716190/AnsiballZ_file.py'
Dec 02 23:29:26 compute-1 sudo[66707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:27 compute-1 python3.9[66709]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:27 compute-1 sudo[66707]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:27 compute-1 sudo[66859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfzhgnbrjaygyelzzgxkoexzaxilmjyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718167.2276661-225-162316123466584/AnsiballZ_stat.py'
Dec 02 23:29:27 compute-1 sudo[66859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:27 compute-1 python3.9[66861]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:27 compute-1 sudo[66859]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:28 compute-1 sudo[66982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbwkolwrtilhqbynqzbxoqiwllnlnmhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718167.2276661-225-162316123466584/AnsiballZ_copy.py'
Dec 02 23:29:28 compute-1 sudo[66982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:28 compute-1 python3.9[66984]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718167.2276661-225-162316123466584/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:28 compute-1 sudo[66982]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:28 compute-1 sudo[67134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhwlnukunjtbsoyftkmhewymnrbbgthz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718168.6467838-255-271373870952842/AnsiballZ_stat.py'
Dec 02 23:29:29 compute-1 sudo[67134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:29 compute-1 python3.9[67136]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:29 compute-1 sudo[67134]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:29 compute-1 sudo[67257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wifiiucvtqwipgpocraqieuiikfrtepe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718168.6467838-255-271373870952842/AnsiballZ_copy.py'
Dec 02 23:29:29 compute-1 sudo[67257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:29 compute-1 python3.9[67259]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718168.6467838-255-271373870952842/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:29 compute-1 sudo[67257]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:30 compute-1 sudo[67409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfupiqwpmmklcczskthtzewektjprrru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718170.076373-285-60232731627449/AnsiballZ_systemd.py'
Dec 02 23:29:30 compute-1 sudo[67409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:30 compute-1 python3.9[67411]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:29:30 compute-1 systemd[1]: Reloading.
Dec 02 23:29:31 compute-1 systemd-rc-local-generator[67439]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:29:31 compute-1 systemd-sysv-generator[67443]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:29:31 compute-1 systemd[1]: Reloading.
Dec 02 23:29:31 compute-1 systemd-rc-local-generator[67473]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:29:31 compute-1 systemd-sysv-generator[67480]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:29:31 compute-1 systemd[1]: Starting EDPM Container Shutdown...
Dec 02 23:29:31 compute-1 systemd[1]: Finished EDPM Container Shutdown.
Dec 02 23:29:31 compute-1 sudo[67409]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:32 compute-1 sudo[67637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biucgezqwppqbwqxorjvhozrbgkprexc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718171.7967176-301-189980108659477/AnsiballZ_stat.py'
Dec 02 23:29:32 compute-1 sudo[67637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:32 compute-1 python3.9[67639]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:32 compute-1 sudo[67637]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:32 compute-1 sudo[67760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-werzytduxoidwvtnjufufapqxkxjzkdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718171.7967176-301-189980108659477/AnsiballZ_copy.py'
Dec 02 23:29:32 compute-1 sudo[67760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:33 compute-1 python3.9[67762]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718171.7967176-301-189980108659477/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:33 compute-1 sudo[67760]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:33 compute-1 sudo[67912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwvqtbzewescgxmbafuqoeblimmbdbai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718173.2589366-331-146260961881087/AnsiballZ_stat.py'
Dec 02 23:29:33 compute-1 sudo[67912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:33 compute-1 python3.9[67914]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:33 compute-1 sudo[67912]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:34 compute-1 sudo[68035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhkhhtcldaywhsmupwwlpwdpwfqfslpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718173.2589366-331-146260961881087/AnsiballZ_copy.py'
Dec 02 23:29:34 compute-1 sudo[68035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:34 compute-1 python3.9[68037]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718173.2589366-331-146260961881087/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:34 compute-1 sudo[68035]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:35 compute-1 sudo[68187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvcdgbzykeunkmwszjnimhjfkgmgmloc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718174.709857-361-216038769883396/AnsiballZ_systemd.py'
Dec 02 23:29:35 compute-1 sudo[68187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:35 compute-1 python3.9[68189]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:29:35 compute-1 systemd[1]: Reloading.
Dec 02 23:29:35 compute-1 systemd-rc-local-generator[68213]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:29:35 compute-1 systemd-sysv-generator[68217]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:29:35 compute-1 systemd[1]: Reloading.
Dec 02 23:29:35 compute-1 systemd-rc-local-generator[68253]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:29:35 compute-1 systemd-sysv-generator[68257]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:29:35 compute-1 systemd[1]: Starting Create netns directory...
Dec 02 23:29:35 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 23:29:35 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 23:29:35 compute-1 systemd[1]: Finished Create netns directory.
Dec 02 23:29:36 compute-1 sudo[68187]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:36 compute-1 python3.9[68414]: ansible-ansible.builtin.service_facts Invoked
Dec 02 23:29:37 compute-1 network[68431]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 23:29:37 compute-1 network[68432]: 'network-scripts' will be removed from distribution in near future.
Dec 02 23:29:37 compute-1 network[68433]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 23:29:42 compute-1 sudo[68693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voczwgdldbqhsqtstgvsmftmualkleeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718182.6100683-393-263802585392295/AnsiballZ_systemd.py'
Dec 02 23:29:42 compute-1 sudo[68693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:43 compute-1 python3.9[68695]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:29:43 compute-1 systemd[1]: Reloading.
Dec 02 23:29:43 compute-1 systemd-sysv-generator[68729]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:29:43 compute-1 systemd-rc-local-generator[68726]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:29:43 compute-1 systemd[1]: Stopping IPv4 firewall with iptables...
Dec 02 23:29:43 compute-1 iptables.init[68736]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec 02 23:29:43 compute-1 iptables.init[68736]: iptables: Flushing firewall rules: [  OK  ]
Dec 02 23:29:43 compute-1 systemd[1]: iptables.service: Deactivated successfully.
Dec 02 23:29:43 compute-1 systemd[1]: Stopped IPv4 firewall with iptables.
Dec 02 23:29:43 compute-1 sudo[68693]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:44 compute-1 sudo[68930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkqtxslhvgplibolhbcxrcvfxajakcds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718183.9995265-393-142248869014039/AnsiballZ_systemd.py'
Dec 02 23:29:44 compute-1 sudo[68930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:44 compute-1 python3.9[68932]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:29:44 compute-1 sudo[68930]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:45 compute-1 sudo[69084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uimbczosxfphqduhupgumtuowwohpgco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718185.1557136-425-42035866116526/AnsiballZ_systemd.py'
Dec 02 23:29:45 compute-1 sudo[69084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:45 compute-1 python3.9[69086]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:29:45 compute-1 systemd[1]: Reloading.
Dec 02 23:29:45 compute-1 systemd-rc-local-generator[69109]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:29:45 compute-1 systemd-sysv-generator[69114]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:29:46 compute-1 systemd[1]: Starting Netfilter Tables...
Dec 02 23:29:46 compute-1 systemd[1]: Finished Netfilter Tables.
Dec 02 23:29:46 compute-1 sudo[69084]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:46 compute-1 sudo[69275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mudfysfgqpimjyyahwojfrpilljydfdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718186.5163302-441-209963398163082/AnsiballZ_command.py'
Dec 02 23:29:46 compute-1 sudo[69275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:47 compute-1 python3.9[69277]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:29:47 compute-1 sudo[69275]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:48 compute-1 sudo[69428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuupnvjfximsdpogakiutxleabdwxvbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718187.8101487-469-168244616241045/AnsiballZ_stat.py'
Dec 02 23:29:48 compute-1 sudo[69428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:48 compute-1 python3.9[69430]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:48 compute-1 sudo[69428]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:48 compute-1 sudo[69553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdnvwpgrealvcpfwrjgrilyiiygshyms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718187.8101487-469-168244616241045/AnsiballZ_copy.py'
Dec 02 23:29:48 compute-1 sudo[69553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:49 compute-1 python3.9[69555]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718187.8101487-469-168244616241045/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:49 compute-1 sudo[69553]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:49 compute-1 sudo[69706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xatfufwjndyoskkwscivnpelgdzpczlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718189.5356076-499-163915398217592/AnsiballZ_systemd.py'
Dec 02 23:29:49 compute-1 sudo[69706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:50 compute-1 python3.9[69708]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:29:50 compute-1 systemd[1]: Reloading OpenSSH server daemon...
Dec 02 23:29:50 compute-1 sshd[1008]: Received SIGHUP; restarting.
Dec 02 23:29:50 compute-1 sshd[1008]: Server listening on 0.0.0.0 port 22.
Dec 02 23:29:50 compute-1 sshd[1008]: Server listening on :: port 22.
Dec 02 23:29:50 compute-1 systemd[1]: Reloaded OpenSSH server daemon.
Dec 02 23:29:50 compute-1 sudo[69706]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:50 compute-1 sudo[69862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqmttqiwvhiuuiuhzunvlfdmijhmpndm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718190.6325955-515-192695700519913/AnsiballZ_file.py'
Dec 02 23:29:50 compute-1 sudo[69862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:51 compute-1 python3.9[69864]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:51 compute-1 sudo[69862]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:51 compute-1 sudo[70014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmexdmfinyzezkhjyhzlkjfzqcvbnjlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718191.4328022-531-99933820127022/AnsiballZ_stat.py'
Dec 02 23:29:51 compute-1 sudo[70014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:51 compute-1 python3.9[70016]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:51 compute-1 sudo[70014]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:52 compute-1 sudo[70137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwujhnfwunarcogmqklodhhiueiwzuyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718191.4328022-531-99933820127022/AnsiballZ_copy.py'
Dec 02 23:29:52 compute-1 sudo[70137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:52 compute-1 python3.9[70139]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718191.4328022-531-99933820127022/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:52 compute-1 sudo[70137]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:53 compute-1 sudo[70289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knxcyuyrboztbstdfifkcppugklvqzhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718193.0080051-567-48622361098814/AnsiballZ_timezone.py'
Dec 02 23:29:53 compute-1 sudo[70289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:53 compute-1 python3.9[70291]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 02 23:29:53 compute-1 systemd[1]: Starting Time & Date Service...
Dec 02 23:29:53 compute-1 systemd[1]: Started Time & Date Service.
Dec 02 23:29:54 compute-1 sudo[70289]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:54 compute-1 sudo[70445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrulrnxgqnuljcrgegamoirlznjbnwzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718194.3509724-585-156035701999131/AnsiballZ_file.py'
Dec 02 23:29:54 compute-1 sudo[70445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:54 compute-1 python3.9[70447]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:54 compute-1 sudo[70445]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:55 compute-1 sudo[70597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzjdgygfoysulslnifyqhttgvvxnblko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718195.175803-601-149717852611440/AnsiballZ_stat.py'
Dec 02 23:29:55 compute-1 sudo[70597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:55 compute-1 python3.9[70599]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:55 compute-1 sudo[70597]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:56 compute-1 sudo[70720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekjnddbvddxiowicetehdzezenlfbdar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718195.175803-601-149717852611440/AnsiballZ_copy.py'
Dec 02 23:29:56 compute-1 sudo[70720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:56 compute-1 python3.9[70722]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718195.175803-601-149717852611440/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:56 compute-1 sudo[70720]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:57 compute-1 sudo[70872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfewhndlmztuxlgwklxvxxqiqejqcwbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718196.6439238-631-28576179461085/AnsiballZ_stat.py'
Dec 02 23:29:57 compute-1 sudo[70872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:57 compute-1 python3.9[70874]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:57 compute-1 sudo[70872]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:57 compute-1 sudo[70995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcmdofanpobczluupznsvzppljqgmwon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718196.6439238-631-28576179461085/AnsiballZ_copy.py'
Dec 02 23:29:57 compute-1 sudo[70995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:57 compute-1 python3.9[70997]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718196.6439238-631-28576179461085/.source.yaml _original_basename=.pgd6kn62 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:57 compute-1 sudo[70995]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:58 compute-1 sudo[71147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcwhwmsvsdchqpfqnjkzhvvzdqubgbal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718198.057735-661-165091661471425/AnsiballZ_stat.py'
Dec 02 23:29:58 compute-1 sudo[71147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:58 compute-1 python3.9[71149]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:58 compute-1 sudo[71147]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:58 compute-1 sudo[71270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgfzwrtikakfkfhdhjqbrzjbcrqrteqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718198.057735-661-165091661471425/AnsiballZ_copy.py'
Dec 02 23:29:58 compute-1 sudo[71270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:59 compute-1 python3.9[71272]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718198.057735-661-165091661471425/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:59 compute-1 sudo[71270]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:59 compute-1 sudo[71422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klkofhxnsrnbowkuonhqgnrsjduioait ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718199.4529293-691-241547091523237/AnsiballZ_command.py'
Dec 02 23:29:59 compute-1 sudo[71422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:00 compute-1 python3.9[71424]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:30:00 compute-1 sudo[71422]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:00 compute-1 sudo[71575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqvjkwjaucuqxpzhkhnrnhsljvxbhykk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718200.3172414-707-50670651480043/AnsiballZ_command.py'
Dec 02 23:30:00 compute-1 sudo[71575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:00 compute-1 python3.9[71577]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:30:00 compute-1 sudo[71575]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:01 compute-1 sudo[71728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxphyhuqrdgibvivucuvtayjepfwmspb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718201.0749722-723-2453973450503/AnsiballZ_edpm_nftables_from_files.py'
Dec 02 23:30:01 compute-1 sudo[71728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:01 compute-1 python3[71730]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 02 23:30:01 compute-1 sudo[71728]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:02 compute-1 sudo[71880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycdklsohrihuhatmtdvkxasebxcbwgha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718202.1509054-740-271324610065731/AnsiballZ_stat.py'
Dec 02 23:30:02 compute-1 sudo[71880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:02 compute-1 python3.9[71882]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:30:02 compute-1 sudo[71880]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:03 compute-1 sudo[72003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpxexlrosahxqjhuazqogjaanjjufixh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718202.1509054-740-271324610065731/AnsiballZ_copy.py'
Dec 02 23:30:03 compute-1 sudo[72003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:03 compute-1 python3.9[72005]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718202.1509054-740-271324610065731/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:03 compute-1 sudo[72003]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:03 compute-1 sudo[72155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lotukeusmrcajgmrfsqhhjfjqyirdetp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718203.576413-769-179427717876848/AnsiballZ_stat.py'
Dec 02 23:30:03 compute-1 sudo[72155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:04 compute-1 python3.9[72157]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:30:04 compute-1 sudo[72155]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:04 compute-1 sudo[72278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awdvsgmuomuuvxmoinhxexsimtnvtpes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718203.576413-769-179427717876848/AnsiballZ_copy.py'
Dec 02 23:30:04 compute-1 sudo[72278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:04 compute-1 python3.9[72280]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718203.576413-769-179427717876848/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:04 compute-1 sudo[72278]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:05 compute-1 sudo[72430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpmxkrbxuddvmwsvftxdbtvbrenismgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718205.0419261-799-229412823172083/AnsiballZ_stat.py'
Dec 02 23:30:05 compute-1 sudo[72430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:05 compute-1 python3.9[72432]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:30:05 compute-1 sudo[72430]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:06 compute-1 sudo[72553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csmgcvtswyvdrhtgdiowrensegldrguw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718205.0419261-799-229412823172083/AnsiballZ_copy.py'
Dec 02 23:30:06 compute-1 sudo[72553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:06 compute-1 python3.9[72555]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718205.0419261-799-229412823172083/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:06 compute-1 sudo[72553]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:06 compute-1 sudo[72705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwpoxckajorfrvateomrimccdrriagmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718206.5312824-829-61632491661151/AnsiballZ_stat.py'
Dec 02 23:30:06 compute-1 sudo[72705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:07 compute-1 python3.9[72707]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:30:07 compute-1 sudo[72705]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:07 compute-1 sudo[72828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqhbvfjqslayzluguyduevxklcuheebs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718206.5312824-829-61632491661151/AnsiballZ_copy.py'
Dec 02 23:30:07 compute-1 sudo[72828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:07 compute-1 python3.9[72830]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718206.5312824-829-61632491661151/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:07 compute-1 sudo[72828]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:08 compute-1 sudo[72980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjzixiekqjottgjxgvinlfvcokbszpit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718208.0584204-859-249698283116880/AnsiballZ_stat.py'
Dec 02 23:30:08 compute-1 sudo[72980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:08 compute-1 python3.9[72982]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:30:08 compute-1 sudo[72980]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:09 compute-1 sudo[73103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uetqljybvzodhqmjwhlpplnlblccbiqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718208.0584204-859-249698283116880/AnsiballZ_copy.py'
Dec 02 23:30:09 compute-1 sudo[73103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:09 compute-1 python3.9[73105]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718208.0584204-859-249698283116880/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:09 compute-1 sudo[73103]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:09 compute-1 sudo[73255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjfdwfhrdspztadgbgebhgmzshzdnidf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718209.6951764-889-85268066004034/AnsiballZ_file.py'
Dec 02 23:30:09 compute-1 sudo[73255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:10 compute-1 python3.9[73257]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:10 compute-1 sudo[73255]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:10 compute-1 sudo[73407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfbhcxosklisfzwuahnbeqrpekcuplkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718210.4016206-905-237469138455796/AnsiballZ_command.py'
Dec 02 23:30:10 compute-1 sudo[73407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:10 compute-1 python3.9[73409]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:30:11 compute-1 sudo[73407]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:11 compute-1 sudo[73566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbqdpuyoyknjojfovdcuqqxkfpwlbujx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718211.2451901-921-44913385963377/AnsiballZ_blockinfile.py'
Dec 02 23:30:11 compute-1 sudo[73566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:12 compute-1 python3.9[73568]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:12 compute-1 sudo[73566]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:12 compute-1 sudo[73719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vruxhksyilsbwaxjpndpctcyjbvqfogq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718212.394053-939-243655266142708/AnsiballZ_file.py'
Dec 02 23:30:12 compute-1 sudo[73719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:12 compute-1 python3.9[73721]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:12 compute-1 sudo[73719]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:13 compute-1 sudo[73871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-covgjivyhchqzpicxouzvunertbyrric ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718213.089831-939-53179508266457/AnsiballZ_file.py'
Dec 02 23:30:13 compute-1 sudo[73871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:13 compute-1 python3.9[73873]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:13 compute-1 sudo[73871]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:14 compute-1 sudo[74023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfeeqvhylwyezxzxxpznzclicqodzebh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718213.9396846-969-23681637153801/AnsiballZ_mount.py'
Dec 02 23:30:14 compute-1 sudo[74023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:14 compute-1 python3.9[74025]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 02 23:30:14 compute-1 sudo[74023]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:14 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 23:30:14 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 23:30:15 compute-1 sudo[74177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqmahrrpjgteahqrquawoewmwhqaltem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718214.820357-969-112236235124219/AnsiballZ_mount.py'
Dec 02 23:30:15 compute-1 sudo[74177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:15 compute-1 python3.9[74179]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 02 23:30:15 compute-1 sudo[74177]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:15 compute-1 sshd-session[65021]: Connection closed by 192.168.122.30 port 45816
Dec 02 23:30:15 compute-1 sshd-session[65018]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:30:15 compute-1 systemd[1]: session-15.scope: Deactivated successfully.
Dec 02 23:30:15 compute-1 systemd[1]: session-15.scope: Consumed 41.754s CPU time.
Dec 02 23:30:15 compute-1 systemd-logind[790]: Session 15 logged out. Waiting for processes to exit.
Dec 02 23:30:15 compute-1 systemd-logind[790]: Removed session 15.
Dec 02 23:30:21 compute-1 sshd-session[74205]: Accepted publickey for zuul from 192.168.122.30 port 37738 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:30:21 compute-1 systemd-logind[790]: New session 16 of user zuul.
Dec 02 23:30:21 compute-1 systemd[1]: Started Session 16 of User zuul.
Dec 02 23:30:21 compute-1 sshd-session[74205]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:30:21 compute-1 sudo[74358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwaqxfuefxdybfkexlfzufefdqfejwwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718221.4054394-18-36701599145422/AnsiballZ_tempfile.py'
Dec 02 23:30:21 compute-1 sudo[74358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:22 compute-1 python3.9[74360]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 02 23:30:22 compute-1 sudo[74358]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:22 compute-1 sudo[74510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfmdjtlmtnvcqirflhuagwyenlkmjlnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718222.334076-42-73759353892690/AnsiballZ_stat.py'
Dec 02 23:30:22 compute-1 sudo[74510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:22 compute-1 python3.9[74512]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:30:22 compute-1 sudo[74510]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:23 compute-1 sudo[74662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znfqnugnvevzatiqbfgeqbcjuxxrfmgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718223.242291-62-243185189401157/AnsiballZ_setup.py'
Dec 02 23:30:23 compute-1 sudo[74662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:24 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 02 23:30:24 compute-1 python3.9[74664]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:30:24 compute-1 sudo[74662]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:24 compute-1 sudo[74816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkoaedceuymdpetjztzvgekbuksucngd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718224.4577706-79-88708200057100/AnsiballZ_blockinfile.py'
Dec 02 23:30:24 compute-1 sudo[74816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:25 compute-1 python3.9[74818]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsf4O3TC9mqG5pmtZKzDID4ioApUCWMMcMl4FlQ3yDYoM34JJMwDpWmYo9yQeQH7Zz/mYY4kvj34n9pP6UpZh8YYTgrDO/xB5m08yB19hZlBLNgcS18Dl3aCrBlPC7/HRLSXsBGMqfD0dYlcv577j+jpmLyeex2U43tAJwee5EE74TbgHK0hzWiqONZO0KoJC0q2wyLlOa+dZFsIK2fiLjTjwdANF3t0KH6yhzS2J92gfoAUepv4JPBZWhLkuLrx9JrcJMWhKakHpNoy4vezvWVHBo45bBMlwyJABzPuaDKGqpWVe2XSS7CMZRzLLdmpbxOAin0VmmvpX9tf58g912pgSVPka/24eTGrQyyI5roB63r1vZsR9IUAlwsTO90EmGrxGdvIsWQ/aOthlTYdw84AdrxbSpaCzyMYmvjtCVriPliEwTQXsIQHUKv0KyAD2kMAmgBBd/D3seyLAs+Y2xL+gWwoMxhMc6IZFOrzU5UBBAfnLyma1mkx7C3UEu9fs=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAwn7ixIsEYJITP+z2Du/TZpA7vY8Lre+cVRh8KJ//3C
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKfK6GHTjOVRrF8EHoAX2PLtXv9kBkV3qltQW2BmRTPleQCACp3cMUT6m/r0IFqYo99ZvC9bd5l7MrPzRIPHF/w=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCvbCYuawkhhMNcsQPSLD9R8MzWMFWBfNxWjJmMzeDHne2TvGIY0DiG11UE+E0lc8fLWiTHmhGyf0GmYFXup/FZEGgMHiuow4IbBTKxK+1QNjVqFO2S2o7o9zF+NooyO1zc2vbwn6D0Is1C3Zk+kyNKxOqKipgjEeFmN+dLdOtNrq/adI/ddM7mbWoJ2sF51XQHbgEt1Ad0ezxCRV1w6buNRIFym2S6pTAPQnkbaqmgQT3Tuq6e45Yvcnw8RY/QvcsMEhodIUNRGQGu4EkUdnY3bG7ucdWSRq5NpUgGVJVxacGyWuQ3pT6V9Mwb5MmOF3C6Nl4E5in3zUjnxxqrfW2uPHaajuvmDlIVoVzZkkVyc7neL/UZ3sg0G8BhCBllzHACU1ZKBdAhC6sj6fZa7rLtzsXXGQq/7Tt1VLSr4A/hna1l3Re/GZ1nnhILvetATInRD43bQChUO6Qys+jY/aug2jC2YYQzxGcBWHZAsYtdcNvZXu+ilZgyhJlx4Mb5mTk=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINhbu3i8/fSUjpuw8K1MdLb5KuV5JdkyD7r8WJXXv5aD
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDr3lh4tg7NfAHVqbHGCl7z1xqpJrlsy9GroQGzPqqhUZoSUzEpLTia7mFOGTkU3wwGaWmgSVJctHRDjBh64t0w=
                                             create=True mode=0644 path=/tmp/ansible.pcxdtall state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:25 compute-1 sudo[74816]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:25 compute-1 sudo[74968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flamglfglzqnhwgdihkquhuvxdlxmlln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718225.335798-95-246204438770588/AnsiballZ_command.py'
Dec 02 23:30:25 compute-1 sudo[74968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:25 compute-1 python3.9[74970]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.pcxdtall' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:30:25 compute-1 sudo[74968]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:26 compute-1 sudo[75122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtvewuazdkqhjxojscrwzouokdxvasbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718226.2444487-111-6866948315580/AnsiballZ_file.py'
Dec 02 23:30:26 compute-1 sudo[75122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:27 compute-1 python3.9[75124]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.pcxdtall state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:27 compute-1 sudo[75122]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:27 compute-1 sshd-session[74208]: Connection closed by 192.168.122.30 port 37738
Dec 02 23:30:27 compute-1 sshd-session[74205]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:30:27 compute-1 systemd[1]: session-16.scope: Deactivated successfully.
Dec 02 23:30:27 compute-1 systemd[1]: session-16.scope: Consumed 3.638s CPU time.
Dec 02 23:30:27 compute-1 systemd-logind[790]: Session 16 logged out. Waiting for processes to exit.
Dec 02 23:30:27 compute-1 systemd-logind[790]: Removed session 16.
Dec 02 23:30:33 compute-1 sshd-session[75149]: Accepted publickey for zuul from 192.168.122.30 port 34714 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:30:33 compute-1 systemd-logind[790]: New session 17 of user zuul.
Dec 02 23:30:33 compute-1 systemd[1]: Started Session 17 of User zuul.
Dec 02 23:30:33 compute-1 sshd-session[75149]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:30:34 compute-1 python3.9[75302]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:30:35 compute-1 sudo[75456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fppssfemviyeymdniyzmrxndecdfdonc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718234.629368-45-146106361141438/AnsiballZ_systemd.py'
Dec 02 23:30:35 compute-1 sudo[75456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:35 compute-1 python3.9[75458]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 02 23:30:35 compute-1 sudo[75456]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:36 compute-1 sudo[75610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsvqehgcnxahqfukoujlcwykxitpsfko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718235.7934637-61-153712175016354/AnsiballZ_systemd.py'
Dec 02 23:30:36 compute-1 sudo[75610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:36 compute-1 python3.9[75612]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:30:36 compute-1 sudo[75610]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:37 compute-1 sudo[75763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxgzhexchjoihusyhvsdmdhupjswzqjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718236.732925-79-110392130411129/AnsiballZ_command.py'
Dec 02 23:30:37 compute-1 sudo[75763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:37 compute-1 python3.9[75765]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:30:37 compute-1 sudo[75763]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:38 compute-1 sudo[75916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulotwkoaugjvywtdsltefnbbxxfzghjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718237.6150284-95-242735045333640/AnsiballZ_stat.py'
Dec 02 23:30:38 compute-1 sudo[75916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:38 compute-1 python3.9[75918]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:30:38 compute-1 sudo[75916]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:38 compute-1 sudo[76070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvkdpapswwfvqubgeuiqfkqmuijuaiya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718238.414078-111-204286819702093/AnsiballZ_command.py'
Dec 02 23:30:38 compute-1 sudo[76070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:38 compute-1 python3.9[76072]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:30:38 compute-1 sudo[76070]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:39 compute-1 sudo[76225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcsqrhbudwdryagnurlqrxrwxquwibyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718239.184632-127-225175726214686/AnsiballZ_file.py'
Dec 02 23:30:39 compute-1 sudo[76225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:39 compute-1 python3.9[76227]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:39 compute-1 sudo[76225]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:40 compute-1 sshd-session[75152]: Connection closed by 192.168.122.30 port 34714
Dec 02 23:30:40 compute-1 sshd-session[75149]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:30:40 compute-1 systemd[1]: session-17.scope: Deactivated successfully.
Dec 02 23:30:40 compute-1 systemd[1]: session-17.scope: Consumed 4.673s CPU time.
Dec 02 23:30:40 compute-1 systemd-logind[790]: Session 17 logged out. Waiting for processes to exit.
Dec 02 23:30:40 compute-1 systemd-logind[790]: Removed session 17.
Dec 02 23:30:45 compute-1 sshd-session[76252]: Accepted publickey for zuul from 192.168.122.30 port 48288 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:30:45 compute-1 systemd-logind[790]: New session 18 of user zuul.
Dec 02 23:30:45 compute-1 systemd[1]: Started Session 18 of User zuul.
Dec 02 23:30:45 compute-1 sshd-session[76252]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:30:46 compute-1 python3.9[76405]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:30:47 compute-1 sudo[76559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrfyuhgiwexhhcjpkfomzmzctkhvbaga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718246.982854-49-40805782023911/AnsiballZ_setup.py'
Dec 02 23:30:47 compute-1 sudo[76559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:47 compute-1 python3.9[76561]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:30:47 compute-1 sudo[76559]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:48 compute-1 sudo[76643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlaisdncjvqzggrlwdwsvsknybbxjxzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718246.982854-49-40805782023911/AnsiballZ_dnf.py'
Dec 02 23:30:48 compute-1 sudo[76643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:48 compute-1 python3.9[76645]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:30:49 compute-1 sudo[76643]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:50 compute-1 python3.9[76796]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:30:51 compute-1 python3.9[76947]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 23:30:52 compute-1 python3.9[77097]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:30:53 compute-1 python3.9[77247]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:30:53 compute-1 sshd-session[76255]: Connection closed by 192.168.122.30 port 48288
Dec 02 23:30:53 compute-1 sshd-session[76252]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:30:53 compute-1 systemd[1]: session-18.scope: Deactivated successfully.
Dec 02 23:30:53 compute-1 systemd[1]: session-18.scope: Consumed 5.782s CPU time.
Dec 02 23:30:53 compute-1 systemd-logind[790]: Session 18 logged out. Waiting for processes to exit.
Dec 02 23:30:53 compute-1 systemd-logind[790]: Removed session 18.
Dec 02 23:30:59 compute-1 sshd-session[77272]: Accepted publickey for zuul from 192.168.122.30 port 56052 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:30:59 compute-1 systemd-logind[790]: New session 19 of user zuul.
Dec 02 23:30:59 compute-1 systemd[1]: Started Session 19 of User zuul.
Dec 02 23:30:59 compute-1 sshd-session[77272]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:31:00 compute-1 python3.9[77425]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:31:02 compute-1 sudo[77579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vydnplvzmvgdtyejkwtyzusslpazfkde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718261.766721-79-183420556478261/AnsiballZ_file.py'
Dec 02 23:31:02 compute-1 sudo[77579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:02 compute-1 python3.9[77581]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:02 compute-1 sudo[77579]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:02 compute-1 sudo[77731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifzaabqsohrbvhgpeqxjgnlypofbjhgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718262.5544972-79-93837287316342/AnsiballZ_file.py'
Dec 02 23:31:02 compute-1 sudo[77731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:03 compute-1 python3.9[77733]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:03 compute-1 sudo[77731]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:03 compute-1 sudo[77883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izfvkacdrhmcdsicmfueyeujnxkbmibo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718263.2342744-109-259241596004177/AnsiballZ_stat.py'
Dec 02 23:31:03 compute-1 sudo[77883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:03 compute-1 python3.9[77885]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:03 compute-1 sudo[77883]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:04 compute-1 sudo[78006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmbtzrwejrlcjcbjfrcbfoyoqvcrczzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718263.2342744-109-259241596004177/AnsiballZ_copy.py'
Dec 02 23:31:04 compute-1 sudo[78006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:04 compute-1 python3.9[78008]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718263.2342744-109-259241596004177/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=34946580cd8e44fa29bcba6e54b9780b9996b8dd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:04 compute-1 sudo[78006]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:05 compute-1 sudo[78158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxaihvhbypbdsgajvmihksplvbewickz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718264.9199235-109-275580685511362/AnsiballZ_stat.py'
Dec 02 23:31:05 compute-1 sudo[78158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:05 compute-1 python3.9[78160]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:05 compute-1 sudo[78158]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:05 compute-1 sudo[78283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxzsthmdtzcrgaoagpfagiybmfakzxqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718264.9199235-109-275580685511362/AnsiballZ_copy.py'
Dec 02 23:31:05 compute-1 sudo[78283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:06 compute-1 python3.9[78285]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718264.9199235-109-275580685511362/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=6067aac5ab79e06195616248c156299111d0656b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:06 compute-1 sudo[78283]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:06 compute-1 sudo[78435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbgghrntoabslyfskbpghusmdrwrygfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718266.3408864-109-200536899644181/AnsiballZ_stat.py'
Dec 02 23:31:06 compute-1 sudo[78435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:06 compute-1 python3.9[78437]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:06 compute-1 sudo[78435]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:07 compute-1 sudo[78558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbzksvwfofaqlldrdiqnhzdxdkgtaxlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718266.3408864-109-200536899644181/AnsiballZ_copy.py'
Dec 02 23:31:07 compute-1 sudo[78558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:07 compute-1 python3.9[78560]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718266.3408864-109-200536899644181/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=21858face43ebbb470668f67b2c1b7085ff89555 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:07 compute-1 sudo[78558]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:07 compute-1 sshd-session[78254]: Received disconnect from 117.5.148.56 port 43836:11:  [preauth]
Dec 02 23:31:07 compute-1 sshd-session[78254]: Disconnected from authenticating user root 117.5.148.56 port 43836 [preauth]
Dec 02 23:31:07 compute-1 sudo[78710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awrtmssynxcfgglkcqkqgrvptciiywhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718267.6300073-203-101173764084150/AnsiballZ_file.py'
Dec 02 23:31:07 compute-1 sudo[78710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:08 compute-1 python3.9[78712]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:08 compute-1 sudo[78710]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:08 compute-1 sudo[78862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htiiztoztgbyovuapolicyglainbeayc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718268.3215463-203-180540069134219/AnsiballZ_file.py'
Dec 02 23:31:08 compute-1 sudo[78862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:08 compute-1 python3.9[78864]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:08 compute-1 sudo[78862]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:09 compute-1 sudo[79014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezezkxrniexeibvxkhvcrblaegluzmjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718268.9779005-235-251748046396384/AnsiballZ_stat.py'
Dec 02 23:31:09 compute-1 sudo[79014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:09 compute-1 python3.9[79016]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:09 compute-1 sudo[79014]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:09 compute-1 sudo[79137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivffqvyjivcqbprheomaftpjttswykjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718268.9779005-235-251748046396384/AnsiballZ_copy.py'
Dec 02 23:31:09 compute-1 sudo[79137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:09 compute-1 python3.9[79139]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718268.9779005-235-251748046396384/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=c53fa9dc6724ab463cea3adb751e9d746ab891a2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:10 compute-1 sudo[79137]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:10 compute-1 sudo[79289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbiiooqgzvltwokhfugcpbeevvqxumul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718270.2698739-235-115309479517454/AnsiballZ_stat.py'
Dec 02 23:31:10 compute-1 sudo[79289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:10 compute-1 python3.9[79291]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:10 compute-1 sudo[79289]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:11 compute-1 sudo[79412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzeckzpcfveadjaonyvsnygjedtdrvbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718270.2698739-235-115309479517454/AnsiballZ_copy.py'
Dec 02 23:31:11 compute-1 sudo[79412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:11 compute-1 python3.9[79414]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718270.2698739-235-115309479517454/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=b65b6f9d57bc766f19cb07712d4556c236316680 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:11 compute-1 sudo[79412]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:11 compute-1 sudo[79564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boqeeixdubhukfnsorinhpcgkuxafzjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718271.3341727-235-236388573706216/AnsiballZ_stat.py'
Dec 02 23:31:11 compute-1 sudo[79564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:11 compute-1 python3.9[79566]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:11 compute-1 sudo[79564]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:12 compute-1 sudo[79687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfbctqiyvxkzycdkaknvyymdfnsbfjnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718271.3341727-235-236388573706216/AnsiballZ_copy.py'
Dec 02 23:31:12 compute-1 sudo[79687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:12 compute-1 python3.9[79689]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718271.3341727-235-236388573706216/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=613b9aa0729d922606d3efdd29a8b6e8577782be backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:12 compute-1 sudo[79687]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:12 compute-1 sudo[79839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaqdvqqhicpmsfgxdrhkesxmctvhgghj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718272.5558228-319-244369153697611/AnsiballZ_file.py'
Dec 02 23:31:12 compute-1 sudo[79839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:13 compute-1 python3.9[79841]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:13 compute-1 sudo[79839]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:13 compute-1 sudo[79991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhhxldsihvzihmtcavtkrbzuzafdapaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718273.20375-319-262684518587129/AnsiballZ_file.py'
Dec 02 23:31:13 compute-1 sudo[79991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:13 compute-1 python3.9[79993]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:13 compute-1 sudo[79991]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:14 compute-1 sudo[80143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjmesrlgzlmpdzxmrgflvtzlyaijnqqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718273.816635-346-233782616611875/AnsiballZ_stat.py'
Dec 02 23:31:14 compute-1 sudo[80143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:14 compute-1 python3.9[80145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:14 compute-1 sudo[80143]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:14 compute-1 sudo[80266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyhffixygshwwxfouqcwbalbilophhgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718273.816635-346-233782616611875/AnsiballZ_copy.py'
Dec 02 23:31:14 compute-1 sudo[80266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:15 compute-1 python3.9[80268]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718273.816635-346-233782616611875/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=8edc5d82397dbf3bec37a0fa9a8b452bc55df9cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:15 compute-1 sudo[80266]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:15 compute-1 sudo[80418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdjxoxtibzatijmkplhjpqoxtpfocezq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718275.2410955-346-134966389210452/AnsiballZ_stat.py'
Dec 02 23:31:15 compute-1 sudo[80418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:15 compute-1 python3.9[80420]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:15 compute-1 sudo[80418]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:16 compute-1 sudo[80541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzhcsfmujwjynifclqesiydjrigwwwtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718275.2410955-346-134966389210452/AnsiballZ_copy.py'
Dec 02 23:31:16 compute-1 sudo[80541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:16 compute-1 python3.9[80543]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718275.2410955-346-134966389210452/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=41a15d23d08d17b0ccb97c2ef18e9b5ee7bff7e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:16 compute-1 sudo[80541]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:17 compute-1 sudo[80693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jodkqkmwgredaktrhmvyhrqwtvqhkrwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718276.8110538-346-169437602788009/AnsiballZ_stat.py'
Dec 02 23:31:17 compute-1 sudo[80693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:17 compute-1 python3.9[80695]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:17 compute-1 sudo[80693]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:17 compute-1 sudo[80816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwktdvbfkngatzjokbwjcueiotweqoqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718276.8110538-346-169437602788009/AnsiballZ_copy.py'
Dec 02 23:31:17 compute-1 sudo[80816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:17 compute-1 python3.9[80818]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718276.8110538-346-169437602788009/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=a99fb9ab0dd5f787d51d331c4f386a9e489477a3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:18 compute-1 sudo[80816]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:18 compute-1 sudo[80968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyhlgnqlbvhitgiwzwisvbhooaixvglk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718278.2831802-442-140202584422125/AnsiballZ_file.py'
Dec 02 23:31:18 compute-1 sudo[80968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:18 compute-1 python3.9[80970]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:18 compute-1 sudo[80968]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:19 compute-1 sudo[81120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eutcsjcldecdmhdcdvcwkjkvdjswvxae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718279.0279713-442-272765227509769/AnsiballZ_file.py'
Dec 02 23:31:19 compute-1 sudo[81120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:19 compute-1 python3.9[81122]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:19 compute-1 sudo[81120]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:20 compute-1 sudo[81272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvjovufbxkcmbtijdtdlxzijpabigiea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718279.7542524-477-37115689994927/AnsiballZ_stat.py'
Dec 02 23:31:20 compute-1 sudo[81272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:20 compute-1 python3.9[81274]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:20 compute-1 sudo[81272]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:20 compute-1 sudo[81395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpurymgweivrsafljlgaddoektiajcca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718279.7542524-477-37115689994927/AnsiballZ_copy.py'
Dec 02 23:31:20 compute-1 sudo[81395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:20 compute-1 python3.9[81397]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718279.7542524-477-37115689994927/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=4fa0d9ea4d9c0ed83f07a5af3461f7f571cc6702 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:20 compute-1 sudo[81395]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:20 compute-1 chronyd[64990]: Selected source 23.159.16.194 (pool.ntp.org)
Dec 02 23:31:21 compute-1 sudo[81547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqgoblwhbozbompxgwhtdmzketbylsql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718281.0842853-477-70060022026457/AnsiballZ_stat.py'
Dec 02 23:31:21 compute-1 sudo[81547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:21 compute-1 python3.9[81549]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:21 compute-1 sudo[81547]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:22 compute-1 sudo[81670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqoszhxbrcrqynemtbbuttvgdnqjjwvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718281.0842853-477-70060022026457/AnsiballZ_copy.py'
Dec 02 23:31:22 compute-1 sudo[81670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:22 compute-1 python3.9[81672]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718281.0842853-477-70060022026457/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=41a15d23d08d17b0ccb97c2ef18e9b5ee7bff7e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:22 compute-1 sudo[81670]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:22 compute-1 sudo[81822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owaoydpvkgpanuifzqkuuyiseycndisf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718282.4543993-477-121442508158758/AnsiballZ_stat.py'
Dec 02 23:31:22 compute-1 sudo[81822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:22 compute-1 python3.9[81824]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:22 compute-1 sudo[81822]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:23 compute-1 sudo[81945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufozvuwgftrlvgdxgpfcjdzopfnwqjvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718282.4543993-477-121442508158758/AnsiballZ_copy.py'
Dec 02 23:31:23 compute-1 sudo[81945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:23 compute-1 python3.9[81947]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718282.4543993-477-121442508158758/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=e1c80b29a48c49b15ee2f6485519f355b6b23439 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:23 compute-1 sudo[81945]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:24 compute-1 sudo[82097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcxzniuyfdrsdqjxhvrdvowxxccqyrjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718284.212674-592-18639781430584/AnsiballZ_file.py'
Dec 02 23:31:24 compute-1 sudo[82097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:24 compute-1 python3.9[82099]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:24 compute-1 sudo[82097]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:25 compute-1 sudo[82249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjyjkwtdjjefrhsocdjgraedfdypsczt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718284.8159523-609-235411723339313/AnsiballZ_stat.py'
Dec 02 23:31:25 compute-1 sudo[82249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:25 compute-1 python3.9[82251]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:25 compute-1 sudo[82249]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:25 compute-1 sudo[82372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckeiqmplwoejqieraoqhffmfhjfumcae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718284.8159523-609-235411723339313/AnsiballZ_copy.py'
Dec 02 23:31:25 compute-1 sudo[82372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:26 compute-1 python3.9[82374]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718284.8159523-609-235411723339313/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f36371576aa58707fc1d2b8554f71ab3575c4735 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:26 compute-1 sudo[82372]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:26 compute-1 sudo[82524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fftbgwlhggvlgqhlrcavrcjwyqcoxhtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718286.3452082-645-276528189627592/AnsiballZ_file.py'
Dec 02 23:31:26 compute-1 sudo[82524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:26 compute-1 python3.9[82526]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:26 compute-1 sudo[82524]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:27 compute-1 sudo[82676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbjrzokthkuvslhuxkwjvdafwsgqwghg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718287.1813078-662-268368567302029/AnsiballZ_stat.py'
Dec 02 23:31:27 compute-1 sudo[82676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:27 compute-1 python3.9[82678]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:27 compute-1 sudo[82676]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:28 compute-1 sudo[82799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmybqkavcxmabqeofuqndkkkuzivuctu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718287.1813078-662-268368567302029/AnsiballZ_copy.py'
Dec 02 23:31:28 compute-1 sudo[82799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:28 compute-1 python3.9[82801]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718287.1813078-662-268368567302029/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f36371576aa58707fc1d2b8554f71ab3575c4735 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:28 compute-1 sudo[82799]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:29 compute-1 sudo[82951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icoaxjsmvnecvmhpqqhrqdzfnfywhkmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718288.875162-695-197812161107324/AnsiballZ_file.py'
Dec 02 23:31:29 compute-1 sudo[82951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:29 compute-1 python3.9[82953]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:29 compute-1 sudo[82951]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:30 compute-1 sudo[83103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoplupcwpdojzkeurqvbxjxixajtnsec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718289.6779585-710-192348460485011/AnsiballZ_stat.py'
Dec 02 23:31:30 compute-1 sudo[83103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:30 compute-1 python3.9[83105]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:30 compute-1 sudo[83103]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:30 compute-1 sudo[83226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joxzqaqjdjqpgpfdxvclfbyqrhdpiyga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718289.6779585-710-192348460485011/AnsiballZ_copy.py'
Dec 02 23:31:30 compute-1 sudo[83226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:30 compute-1 python3.9[83228]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718289.6779585-710-192348460485011/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f36371576aa58707fc1d2b8554f71ab3575c4735 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:30 compute-1 sudo[83226]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:31 compute-1 sudo[83378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdanvizandvpvcgpvyokspgxjovcfxor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718291.1030304-743-30852924349901/AnsiballZ_file.py'
Dec 02 23:31:31 compute-1 sudo[83378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:31 compute-1 python3.9[83380]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:31 compute-1 sudo[83378]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:32 compute-1 sudo[83530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivrkbicamgsghmxjxbfmrdjgzpjjrgfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718291.8281476-759-29753492486987/AnsiballZ_stat.py'
Dec 02 23:31:32 compute-1 sudo[83530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:32 compute-1 python3.9[83532]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:32 compute-1 sudo[83530]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:32 compute-1 sudo[83653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqzlgazvnuuievqfkbisulsucutglhlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718291.8281476-759-29753492486987/AnsiballZ_copy.py'
Dec 02 23:31:32 compute-1 sudo[83653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:33 compute-1 python3.9[83655]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718291.8281476-759-29753492486987/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f36371576aa58707fc1d2b8554f71ab3575c4735 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:33 compute-1 sudo[83653]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:33 compute-1 sudo[83805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqmwnccxgnpdvytodlrulicvbanvibwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718293.3041973-793-168897161034792/AnsiballZ_file.py'
Dec 02 23:31:33 compute-1 sudo[83805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:33 compute-1 python3.9[83807]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:33 compute-1 sudo[83805]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:34 compute-1 sudo[83957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaczsymewhnzoannfpiubrlstopmvdwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718294.1048796-811-144585591252839/AnsiballZ_stat.py'
Dec 02 23:31:34 compute-1 sudo[83957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:34 compute-1 python3.9[83959]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:34 compute-1 sudo[83957]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:35 compute-1 sudo[84080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdfwjolieipkegbzzeidsocppjsbkdae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718294.1048796-811-144585591252839/AnsiballZ_copy.py'
Dec 02 23:31:35 compute-1 sudo[84080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:35 compute-1 python3.9[84082]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718294.1048796-811-144585591252839/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f36371576aa58707fc1d2b8554f71ab3575c4735 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:35 compute-1 sudo[84080]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:35 compute-1 sudo[84232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdxaylywyqlkgsxponrubfxpxsqyudcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718295.6626914-845-231214699620061/AnsiballZ_file.py'
Dec 02 23:31:35 compute-1 sudo[84232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:36 compute-1 python3.9[84234]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:36 compute-1 sudo[84232]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:36 compute-1 sudo[84384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlsprzyfovtcpxibrhujjtqdjlgoucuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718296.42331-859-276577979594903/AnsiballZ_stat.py'
Dec 02 23:31:36 compute-1 sudo[84384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:36 compute-1 python3.9[84386]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:36 compute-1 sudo[84384]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:37 compute-1 sudo[84507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gizvwkrzhepyjgtddwybfniyopvcinvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718296.42331-859-276577979594903/AnsiballZ_copy.py'
Dec 02 23:31:37 compute-1 sudo[84507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:37 compute-1 python3.9[84509]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718296.42331-859-276577979594903/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f36371576aa58707fc1d2b8554f71ab3575c4735 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:37 compute-1 sudo[84507]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:38 compute-1 sudo[84659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auryzzwpkivykclzsaumntqwwqplzkre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718297.895574-893-123316337927084/AnsiballZ_file.py'
Dec 02 23:31:38 compute-1 sudo[84659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:38 compute-1 python3.9[84661]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:38 compute-1 sudo[84659]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:39 compute-1 sudo[84811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jleseetkcnwlpxuhabvbvtkryrlhzzab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718298.6324852-903-201956306406812/AnsiballZ_stat.py'
Dec 02 23:31:39 compute-1 sudo[84811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:39 compute-1 python3.9[84813]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:39 compute-1 sudo[84811]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:39 compute-1 sudo[84934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxujytzmbvgvggxgnhjphoyokpesbtiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718298.6324852-903-201956306406812/AnsiballZ_copy.py'
Dec 02 23:31:39 compute-1 sudo[84934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:39 compute-1 python3.9[84936]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718298.6324852-903-201956306406812/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f36371576aa58707fc1d2b8554f71ab3575c4735 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:39 compute-1 sudo[84934]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:40 compute-1 sshd-session[77275]: Connection closed by 192.168.122.30 port 56052
Dec 02 23:31:40 compute-1 sshd-session[77272]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:31:40 compute-1 systemd[1]: session-19.scope: Deactivated successfully.
Dec 02 23:31:40 compute-1 systemd[1]: session-19.scope: Consumed 31.413s CPU time.
Dec 02 23:31:40 compute-1 systemd-logind[790]: Session 19 logged out. Waiting for processes to exit.
Dec 02 23:31:40 compute-1 systemd-logind[790]: Removed session 19.
Dec 02 23:31:46 compute-1 sshd-session[84961]: Accepted publickey for zuul from 192.168.122.30 port 42204 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:31:46 compute-1 systemd-logind[790]: New session 20 of user zuul.
Dec 02 23:31:46 compute-1 systemd[1]: Started Session 20 of User zuul.
Dec 02 23:31:46 compute-1 sshd-session[84961]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:31:47 compute-1 python3.9[85114]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:31:48 compute-1 sudo[85268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-silhcsxjrhislygzesydgssfkpymaodn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718307.8974333-49-192217666230722/AnsiballZ_file.py'
Dec 02 23:31:48 compute-1 sudo[85268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:48 compute-1 python3.9[85270]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:48 compute-1 sudo[85268]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:49 compute-1 sudo[85420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfhecavivhmccrzptcqcwntbtmfeorwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718308.9823089-49-54571787880865/AnsiballZ_file.py'
Dec 02 23:31:49 compute-1 sudo[85420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:49 compute-1 python3.9[85422]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:49 compute-1 sudo[85420]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:50 compute-1 python3.9[85572]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:31:51 compute-1 sudo[85722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sejjzgqjhxxwjspextrjvrzjmneoyjly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718310.9540622-95-255772612980568/AnsiballZ_seboolean.py'
Dec 02 23:31:51 compute-1 sudo[85722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:51 compute-1 python3.9[85724]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 02 23:31:52 compute-1 sudo[85722]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:53 compute-1 sudo[85878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktfklmtzhhnwsuxxngmdugbdlufcpmfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718313.0845215-115-220956410204584/AnsiballZ_setup.py'
Dec 02 23:31:53 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec 02 23:31:53 compute-1 sudo[85878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:53 compute-1 python3.9[85880]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:31:53 compute-1 sudo[85878]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:54 compute-1 sudo[85962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oahwbngfvhiicyaspphzbsnhpvqagrjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718313.0845215-115-220956410204584/AnsiballZ_dnf.py'
Dec 02 23:31:54 compute-1 sudo[85962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:54 compute-1 python3.9[85964]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:31:55 compute-1 sudo[85962]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:56 compute-1 sudo[86115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjgijcbidgmridppisinwyubrlbuucvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718316.1502154-139-352038716207/AnsiballZ_systemd.py'
Dec 02 23:31:56 compute-1 sudo[86115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:57 compute-1 python3.9[86117]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 23:31:57 compute-1 sudo[86115]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:57 compute-1 sudo[86270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goaarrrvhghqnfgtngszdnkcigsunbss ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718317.466468-155-8894642417821/AnsiballZ_edpm_nftables_snippet.py'
Dec 02 23:31:57 compute-1 sudo[86270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:58 compute-1 python3[86272]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec 02 23:31:58 compute-1 sudo[86270]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:58 compute-1 sudo[86422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwgyvgzoeprlimyjrjuafkkglasgzodv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718318.4903817-173-66300133248992/AnsiballZ_file.py'
Dec 02 23:31:58 compute-1 sudo[86422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:59 compute-1 python3.9[86424]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:59 compute-1 sudo[86422]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:59 compute-1 sudo[86574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahfvcnjmukzjkwydbstjubsymjpkbijs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718319.3063564-189-8631177675466/AnsiballZ_stat.py'
Dec 02 23:31:59 compute-1 sudo[86574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:00 compute-1 python3.9[86576]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:00 compute-1 sudo[86574]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:00 compute-1 sudo[86652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sucaatbvwxxncklgynvneararbwlpala ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718319.3063564-189-8631177675466/AnsiballZ_file.py'
Dec 02 23:32:00 compute-1 sudo[86652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:00 compute-1 python3.9[86654]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:00 compute-1 sudo[86652]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:01 compute-1 sudo[86804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqclayqqiqsaqkwzedgrpjlhrxxnemzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718320.9237154-213-82813797860313/AnsiballZ_stat.py'
Dec 02 23:32:01 compute-1 sudo[86804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:01 compute-1 python3.9[86806]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:01 compute-1 sudo[86804]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:01 compute-1 sudo[86882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guuwijmpqtwasgpokfqoogxvwdzjakpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718320.9237154-213-82813797860313/AnsiballZ_file.py'
Dec 02 23:32:01 compute-1 sudo[86882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:02 compute-1 python3.9[86884]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.070h1ck6 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:02 compute-1 sudo[86882]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:02 compute-1 sudo[87034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpdbyaabvltnleasczejlqaddrawqzxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718322.3042896-237-136938668756891/AnsiballZ_stat.py'
Dec 02 23:32:02 compute-1 sudo[87034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:02 compute-1 python3.9[87036]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:02 compute-1 sudo[87034]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:03 compute-1 sudo[87112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hipwduderacyrvlzsddleobukwmciqwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718322.3042896-237-136938668756891/AnsiballZ_file.py'
Dec 02 23:32:03 compute-1 sudo[87112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:03 compute-1 python3.9[87114]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:03 compute-1 sudo[87112]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:04 compute-1 sudo[87264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbnhpynzyntqtgrwjkpmyyxqdgenhbjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718323.6918712-263-33256628967808/AnsiballZ_command.py'
Dec 02 23:32:04 compute-1 sudo[87264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:04 compute-1 python3.9[87266]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:04 compute-1 sudo[87264]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:05 compute-1 sudo[87417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blpkyfqzasxilkkzhtxwvbfmxwyhmftl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718324.6546972-279-141509180776948/AnsiballZ_edpm_nftables_from_files.py'
Dec 02 23:32:05 compute-1 sudo[87417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:05 compute-1 python3[87419]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 02 23:32:05 compute-1 sudo[87417]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:06 compute-1 sudo[87569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouiazpaiiksduoonlpisphkquazbxxjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718325.6422799-295-222662174277195/AnsiballZ_stat.py'
Dec 02 23:32:06 compute-1 sudo[87569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:06 compute-1 python3.9[87571]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:06 compute-1 sudo[87569]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:06 compute-1 sudo[87694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkhzwowzmmzcwyrhehssenoskmiwymke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718325.6422799-295-222662174277195/AnsiballZ_copy.py'
Dec 02 23:32:06 compute-1 sudo[87694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:07 compute-1 python3.9[87696]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718325.6422799-295-222662174277195/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:07 compute-1 sudo[87694]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:07 compute-1 sudo[87846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbsznuylsykfszxmjfweqfplbhurbgja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718327.3287861-325-153496703780116/AnsiballZ_stat.py'
Dec 02 23:32:07 compute-1 sudo[87846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:07 compute-1 python3.9[87848]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:07 compute-1 sudo[87846]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:08 compute-1 sudo[87972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlrnucfaqkiflgnqkcshihzceqcpvedk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718327.3287861-325-153496703780116/AnsiballZ_copy.py'
Dec 02 23:32:08 compute-1 sudo[87972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:08 compute-1 python3.9[87974]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718327.3287861-325-153496703780116/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:08 compute-1 sudo[87972]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:09 compute-1 sudo[88124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cicwwyrijymmjsmsojrmdkqtnlutemtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718328.8461428-355-184953911177656/AnsiballZ_stat.py'
Dec 02 23:32:09 compute-1 sudo[88124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:09 compute-1 python3.9[88126]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:09 compute-1 sudo[88124]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:09 compute-1 sudo[88249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxpmwveruhrvxoxuxkuztzogiscljjzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718328.8461428-355-184953911177656/AnsiballZ_copy.py'
Dec 02 23:32:09 compute-1 sudo[88249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:10 compute-1 python3.9[88251]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718328.8461428-355-184953911177656/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:10 compute-1 sudo[88249]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:10 compute-1 sudo[88401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slmimajpudsmmnyclfrhvtufwiqcqujw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718330.3485365-385-30043314085444/AnsiballZ_stat.py'
Dec 02 23:32:10 compute-1 sudo[88401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:10 compute-1 python3.9[88403]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:10 compute-1 sudo[88401]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:11 compute-1 sudo[88526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvstmjhhlbynmdcifrqfofwrmlhwapmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718330.3485365-385-30043314085444/AnsiballZ_copy.py'
Dec 02 23:32:11 compute-1 sudo[88526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:11 compute-1 python3.9[88528]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718330.3485365-385-30043314085444/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:11 compute-1 sudo[88526]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:12 compute-1 sudo[88678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwyrqdzydlvrrqcmnczhejudhsxbmiax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718331.8697023-415-265665691083332/AnsiballZ_stat.py'
Dec 02 23:32:12 compute-1 sudo[88678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:12 compute-1 python3.9[88680]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:12 compute-1 sudo[88678]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:13 compute-1 sudo[88803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elcouirdpozhgncunnzepckuujboqtwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718331.8697023-415-265665691083332/AnsiballZ_copy.py'
Dec 02 23:32:13 compute-1 sudo[88803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:13 compute-1 python3.9[88805]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718331.8697023-415-265665691083332/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:13 compute-1 sudo[88803]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:13 compute-1 sudo[88955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnmwxyxfzvujassvauqbycegiyibunwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718333.441329-445-39931642527664/AnsiballZ_file.py'
Dec 02 23:32:13 compute-1 sudo[88955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:14 compute-1 python3.9[88957]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:14 compute-1 sudo[88955]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:14 compute-1 sudo[89107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynsucuhglhoiwbnmegmhsmcxtjxhzuoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718334.3073337-461-52914668376953/AnsiballZ_command.py'
Dec 02 23:32:14 compute-1 sudo[89107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:14 compute-1 python3.9[89109]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:15 compute-1 sudo[89107]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:15 compute-1 sudo[89262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iontwfkfyseregnjdyhlxrjexmnphune ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718335.229317-477-173209792681098/AnsiballZ_blockinfile.py'
Dec 02 23:32:15 compute-1 sudo[89262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:15 compute-1 python3.9[89264]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:16 compute-1 sudo[89262]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:16 compute-1 sudo[89414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbxlfmumencbyfwkgdibpyosnyzssfcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718336.3376012-495-51610629375406/AnsiballZ_command.py'
Dec 02 23:32:16 compute-1 sudo[89414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:16 compute-1 python3.9[89416]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:16 compute-1 sudo[89414]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:17 compute-1 sudo[89567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifbtfavcawkeczakqjgjemqfxwpqqdxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718337.163665-511-65257324369244/AnsiballZ_stat.py'
Dec 02 23:32:17 compute-1 sudo[89567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:17 compute-1 python3.9[89569]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:32:17 compute-1 sudo[89567]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:18 compute-1 sudo[89721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpbdmsrpxjbqxtbvxegbgqvbyzganttp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718337.9431424-527-165786690029981/AnsiballZ_command.py'
Dec 02 23:32:18 compute-1 sudo[89721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:18 compute-1 python3.9[89723]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:18 compute-1 sudo[89721]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:19 compute-1 sudo[89876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvpbwghlggdfknrtpbabcvvlgstxhskm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718338.7979348-543-181866703991142/AnsiballZ_file.py'
Dec 02 23:32:19 compute-1 sudo[89876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:19 compute-1 python3.9[89878]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:19 compute-1 sudo[89876]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:20 compute-1 python3.9[90028]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:32:21 compute-1 sudo[90179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsnepctapbkkzaljgfybmkwatshoxmfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718341.5584948-623-273019972338972/AnsiballZ_command.py'
Dec 02 23:32:21 compute-1 sudo[90179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:22 compute-1 python3.9[90181]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:f2:93:49:d5" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:22 compute-1 ovs-vsctl[90182]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:f2:93:49:d5 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 02 23:32:22 compute-1 sudo[90179]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:22 compute-1 sudo[90332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgbkfxjdqqvvrsbnclyurkqwfziilgwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718342.4491906-641-82953384248149/AnsiballZ_command.py'
Dec 02 23:32:22 compute-1 sudo[90332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:23 compute-1 python3.9[90334]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:23 compute-1 sudo[90332]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:23 compute-1 sudo[90487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsncydlzowmfriipkoyjefetbprydzwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718343.2340257-657-157431551890357/AnsiballZ_command.py'
Dec 02 23:32:23 compute-1 sudo[90487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:23 compute-1 python3.9[90489]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:23 compute-1 ovs-vsctl[90490]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec 02 23:32:23 compute-1 sudo[90487]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:24 compute-1 python3.9[90640]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:32:25 compute-1 sudo[90792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtbvlichbusklsckpnhnmeamkyythzda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718344.986109-691-26683623334075/AnsiballZ_file.py'
Dec 02 23:32:25 compute-1 sudo[90792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:25 compute-1 python3.9[90794]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:32:25 compute-1 sudo[90792]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:26 compute-1 sudo[90944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iivgxprvrigsuqispdcrmofdzulpukhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718345.852348-707-182190883772334/AnsiballZ_stat.py'
Dec 02 23:32:26 compute-1 sudo[90944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:26 compute-1 python3.9[90946]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:26 compute-1 sudo[90944]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:26 compute-1 sudo[91022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aobzimvjtqynwdlgnztvgdqcwipsnfow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718345.852348-707-182190883772334/AnsiballZ_file.py'
Dec 02 23:32:26 compute-1 sudo[91022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:26 compute-1 python3.9[91024]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:32:27 compute-1 sudo[91022]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:27 compute-1 sudo[91174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wplzmhwabljdyonimxmypbspusokzryh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718347.1655798-707-262492799235470/AnsiballZ_stat.py'
Dec 02 23:32:27 compute-1 sudo[91174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:27 compute-1 python3.9[91176]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:27 compute-1 sudo[91174]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:27 compute-1 sudo[91252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjoakeqdretclwyppjigwrwtgujkajng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718347.1655798-707-262492799235470/AnsiballZ_file.py'
Dec 02 23:32:27 compute-1 sudo[91252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:28 compute-1 python3.9[91254]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:32:28 compute-1 sudo[91252]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:28 compute-1 sudo[91404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpmfcxskmaonsomvdfyloyugeuwtlawt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718348.4321969-753-120161123250469/AnsiballZ_file.py'
Dec 02 23:32:28 compute-1 sudo[91404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:29 compute-1 python3.9[91406]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:29 compute-1 sudo[91404]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:29 compute-1 sudo[91556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyebiglwthqihsxbnsbskqfbxafjypzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718349.320206-769-108576102463174/AnsiballZ_stat.py'
Dec 02 23:32:29 compute-1 sudo[91556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:29 compute-1 python3.9[91558]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:29 compute-1 sudo[91556]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:30 compute-1 sudo[91634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyiguzdrqeilujpkathknbomyrejlols ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718349.320206-769-108576102463174/AnsiballZ_file.py'
Dec 02 23:32:30 compute-1 sudo[91634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:30 compute-1 python3.9[91636]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:30 compute-1 sudo[91634]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:30 compute-1 sudo[91786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybqhvgjiwgxvdbgtmfvgfpubunujrylw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718350.614228-793-260194408877432/AnsiballZ_stat.py'
Dec 02 23:32:30 compute-1 sudo[91786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:31 compute-1 python3.9[91788]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:31 compute-1 sudo[91786]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:31 compute-1 sudo[91864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsnrxwllpqhlyadjtgkuzfoqbwyhmynn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718350.614228-793-260194408877432/AnsiballZ_file.py'
Dec 02 23:32:31 compute-1 sudo[91864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:31 compute-1 python3.9[91866]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:31 compute-1 sudo[91864]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:32 compute-1 sudo[92016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nofsrmeuctnoosrzinpvghyqpxuhohpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718351.9749105-817-213183329870973/AnsiballZ_systemd.py'
Dec 02 23:32:32 compute-1 sudo[92016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:32 compute-1 python3.9[92018]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:32:32 compute-1 systemd[1]: Reloading.
Dec 02 23:32:32 compute-1 systemd-rc-local-generator[92044]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:32:32 compute-1 systemd-sysv-generator[92047]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:32:33 compute-1 sudo[92016]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:33 compute-1 sudo[92205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbpxymbkexjifeenjisbzcajlleweize ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718353.2519772-833-105660529266318/AnsiballZ_stat.py'
Dec 02 23:32:33 compute-1 sudo[92205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:34 compute-1 python3.9[92207]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:34 compute-1 sudo[92205]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:34 compute-1 sudo[92283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouxufgwxriiexirrmenefeqkbjikxnlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718353.2519772-833-105660529266318/AnsiballZ_file.py'
Dec 02 23:32:34 compute-1 sudo[92283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:34 compute-1 python3.9[92285]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:34 compute-1 sudo[92283]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:35 compute-1 sudo[92435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddoorfjbxgrmzbaxfwamelviahyuilwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718354.8454766-857-273233423174055/AnsiballZ_stat.py'
Dec 02 23:32:35 compute-1 sudo[92435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:35 compute-1 python3.9[92437]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:35 compute-1 sudo[92435]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:35 compute-1 sudo[92513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjbocjmknebpzxonczcnboshiyjysaau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718354.8454766-857-273233423174055/AnsiballZ_file.py'
Dec 02 23:32:35 compute-1 sudo[92513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:35 compute-1 python3.9[92515]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:35 compute-1 sudo[92513]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:36 compute-1 sudo[92665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebdrxznlgyguockviolpxbhvtxggmirm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718356.1855555-881-68651680113538/AnsiballZ_systemd.py'
Dec 02 23:32:36 compute-1 sudo[92665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:36 compute-1 python3.9[92667]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:32:36 compute-1 systemd[1]: Reloading.
Dec 02 23:32:36 compute-1 systemd-rc-local-generator[92696]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:32:37 compute-1 systemd-sysv-generator[92700]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:32:37 compute-1 systemd[1]: Starting Create netns directory...
Dec 02 23:32:37 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 23:32:37 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 23:32:37 compute-1 systemd[1]: Finished Create netns directory.
Dec 02 23:32:37 compute-1 sudo[92665]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:38 compute-1 sudo[92859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nicfkofhkhrvljseikrzntvknadcakkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718358.5548604-901-230424753313187/AnsiballZ_file.py'
Dec 02 23:32:38 compute-1 sudo[92859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:39 compute-1 python3.9[92861]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:32:39 compute-1 sudo[92859]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:39 compute-1 sudo[93011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opyzsjpxqxrwfcxaisansbuehkalwhmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718359.3065248-917-40420307107939/AnsiballZ_stat.py'
Dec 02 23:32:39 compute-1 sudo[93011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:39 compute-1 python3.9[93013]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:39 compute-1 sudo[93011]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:40 compute-1 sudo[93134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mggjpoddyhihpcrfbcitptkyuazflfae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718359.3065248-917-40420307107939/AnsiballZ_copy.py'
Dec 02 23:32:40 compute-1 sudo[93134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:40 compute-1 python3.9[93136]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718359.3065248-917-40420307107939/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:32:40 compute-1 sudo[93134]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:41 compute-1 sudo[93286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkqkgfplugzsimybwpydpsurjqsqaeul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718360.8871295-951-103898138157258/AnsiballZ_file.py'
Dec 02 23:32:41 compute-1 sudo[93286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:41 compute-1 python3.9[93288]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:32:41 compute-1 sudo[93286]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:42 compute-1 sudo[93438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhavjavdlyeoqqoxgnvpmlvazzbwkwct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718361.7037477-967-20786388366855/AnsiballZ_stat.py'
Dec 02 23:32:42 compute-1 sudo[93438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:42 compute-1 python3.9[93440]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:42 compute-1 sudo[93438]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:42 compute-1 sudo[93561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbveoghnrwehgczdakbjfmrkfbjclanl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718361.7037477-967-20786388366855/AnsiballZ_copy.py'
Dec 02 23:32:42 compute-1 sudo[93561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:42 compute-1 python3.9[93563]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718361.7037477-967-20786388366855/.source.json _original_basename=.5cgjpzf3 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:42 compute-1 sudo[93561]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:43 compute-1 sudo[93713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwnjdwidxpjujotkmwoksplnrrddckhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718363.1290877-997-269646022419913/AnsiballZ_file.py'
Dec 02 23:32:43 compute-1 sudo[93713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:43 compute-1 python3.9[93715]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:43 compute-1 sudo[93713]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:44 compute-1 sudo[93865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-serkgveycbyfpobdigdpdozwewwxdmle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718363.952086-1013-123396592006380/AnsiballZ_stat.py'
Dec 02 23:32:44 compute-1 sudo[93865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:44 compute-1 sudo[93865]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:44 compute-1 sudo[93988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fynymecexrzlmgvviclguehpuayksqzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718363.952086-1013-123396592006380/AnsiballZ_copy.py'
Dec 02 23:32:44 compute-1 sudo[93988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:45 compute-1 sudo[93988]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:46 compute-1 sudo[94140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpwnxmzlxjgslqjtgysjrtutaltveufl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718365.570442-1047-158118766269795/AnsiballZ_container_config_data.py'
Dec 02 23:32:46 compute-1 sudo[94140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:46 compute-1 python3.9[94142]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec 02 23:32:46 compute-1 sudo[94140]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:47 compute-1 sudo[94292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbljeejeftjaenbcbwsgydjzmokvtfbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718366.5977664-1065-149916076043107/AnsiballZ_container_config_hash.py'
Dec 02 23:32:47 compute-1 sudo[94292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:47 compute-1 python3.9[94294]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 23:32:47 compute-1 sudo[94292]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:48 compute-1 sudo[94444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvkcwusgqgiitnnjdpimkzldpngbbrkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718367.625108-1083-42356057084396/AnsiballZ_podman_container_info.py'
Dec 02 23:32:48 compute-1 sudo[94444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:48 compute-1 python3.9[94446]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 02 23:32:48 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:32:48 compute-1 sudo[94444]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:49 compute-1 sudo[94606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkrudhwqsuwzobrzweogmmucrnhdipjq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718369.0825164-1109-250568811307343/AnsiballZ_edpm_container_manage.py'
Dec 02 23:32:49 compute-1 sudo[94606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:49 compute-1 python3[94608]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 23:32:49 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:32:49 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:32:50 compute-1 podman[94645]: 2025-12-02 23:32:50.096366761 +0000 UTC m=+0.056724131 container create a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 23:32:50 compute-1 podman[94645]: 2025-12-02 23:32:50.06882374 +0000 UTC m=+0.029181160 image pull 78889ae0cf8c3740f43b6df72a2c4568ab589fb816614851d476abc277d3fffb 38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Dec 02 23:32:50 compute-1 python3[94608]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z 38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Dec 02 23:32:50 compute-1 sudo[94606]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:50 compute-1 sudo[94834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-equjimadjuobspayxucxxoizohltoabu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718370.6214004-1125-182050499857482/AnsiballZ_stat.py'
Dec 02 23:32:50 compute-1 sudo[94834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:50 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:32:51 compute-1 python3.9[94836]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:32:51 compute-1 sudo[94834]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:51 compute-1 sudo[94988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odlageyzjaflayeceedsdemsmbztccjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718371.4748044-1143-70986980953598/AnsiballZ_file.py'
Dec 02 23:32:51 compute-1 sudo[94988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:51 compute-1 python3.9[94990]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:51 compute-1 sudo[94988]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:52 compute-1 sudo[95064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaajmvlfvdaqrgvtwwzqweymzaixptge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718371.4748044-1143-70986980953598/AnsiballZ_stat.py'
Dec 02 23:32:52 compute-1 sudo[95064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:52 compute-1 python3.9[95066]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:32:52 compute-1 sudo[95064]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:52 compute-1 sudo[95215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trpdhnawhqegbhwqsxtjfszeledbzgxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718372.5180204-1143-148590550593330/AnsiballZ_copy.py'
Dec 02 23:32:52 compute-1 sudo[95215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:53 compute-1 python3.9[95217]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764718372.5180204-1143-148590550593330/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:53 compute-1 sudo[95215]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:53 compute-1 sudo[95291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyugxbpbwlidxxxtjqmvamdxqcxcdqoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718372.5180204-1143-148590550593330/AnsiballZ_systemd.py'
Dec 02 23:32:53 compute-1 sudo[95291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:53 compute-1 python3.9[95293]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:32:53 compute-1 systemd[1]: Reloading.
Dec 02 23:32:53 compute-1 systemd-sysv-generator[95327]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:32:53 compute-1 systemd-rc-local-generator[95323]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:32:53 compute-1 sudo[95291]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:54 compute-1 sudo[95404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scislxvrupblbehppgbzqtwkemmkftld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718372.5180204-1143-148590550593330/AnsiballZ_systemd.py'
Dec 02 23:32:54 compute-1 sudo[95404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:54 compute-1 python3.9[95406]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:32:54 compute-1 systemd[1]: Reloading.
Dec 02 23:32:54 compute-1 systemd-rc-local-generator[95438]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:32:54 compute-1 systemd-sysv-generator[95442]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:32:54 compute-1 systemd[1]: Starting ovn_controller container...
Dec 02 23:32:55 compute-1 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec 02 23:32:55 compute-1 systemd[1]: Started libcrun container.
Dec 02 23:32:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/140c7fff6b889cdcea67cb96b33a1907375d013e3e1d74386c3ecc5bec497355/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 02 23:32:55 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda.
Dec 02 23:32:55 compute-1 podman[95448]: 2025-12-02 23:32:55.124805542 +0000 UTC m=+0.184261695 container init a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 23:32:55 compute-1 ovn_controller[95464]: + sudo -E kolla_set_configs
Dec 02 23:32:55 compute-1 podman[95448]: 2025-12-02 23:32:55.166451721 +0000 UTC m=+0.225907824 container start a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 02 23:32:55 compute-1 edpm-start-podman-container[95448]: ovn_controller
Dec 02 23:32:55 compute-1 systemd[1]: Created slice User Slice of UID 0.
Dec 02 23:32:55 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 02 23:32:55 compute-1 edpm-start-podman-container[95447]: Creating additional drop-in dependency for "ovn_controller" (a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda)
Dec 02 23:32:55 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 02 23:32:55 compute-1 podman[95471]: 2025-12-02 23:32:55.263330923 +0000 UTC m=+0.079760825 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 23:32:55 compute-1 systemd[1]: Starting User Manager for UID 0...
Dec 02 23:32:55 compute-1 systemd[1]: a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda-696cf8841bd1eff8.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 23:32:55 compute-1 systemd[1]: a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda-696cf8841bd1eff8.service: Failed with result 'exit-code'.
Dec 02 23:32:55 compute-1 systemd[1]: Reloading.
Dec 02 23:32:55 compute-1 systemd-rc-local-generator[95536]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:32:55 compute-1 systemd-sysv-generator[95540]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:32:55 compute-1 systemd[95509]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Dec 02 23:32:55 compute-1 systemd[1]: Started ovn_controller container.
Dec 02 23:32:55 compute-1 sudo[95404]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:55 compute-1 systemd[95509]: Queued start job for default target Main User Target.
Dec 02 23:32:55 compute-1 systemd[95509]: Created slice User Application Slice.
Dec 02 23:32:55 compute-1 systemd[95509]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 02 23:32:55 compute-1 systemd[95509]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 23:32:55 compute-1 systemd[95509]: Reached target Paths.
Dec 02 23:32:55 compute-1 systemd[95509]: Reached target Timers.
Dec 02 23:32:55 compute-1 systemd[95509]: Starting D-Bus User Message Bus Socket...
Dec 02 23:32:55 compute-1 systemd[95509]: Starting Create User's Volatile Files and Directories...
Dec 02 23:32:55 compute-1 systemd[95509]: Listening on D-Bus User Message Bus Socket.
Dec 02 23:32:55 compute-1 systemd[95509]: Reached target Sockets.
Dec 02 23:32:55 compute-1 systemd[95509]: Finished Create User's Volatile Files and Directories.
Dec 02 23:32:55 compute-1 systemd[95509]: Reached target Basic System.
Dec 02 23:32:55 compute-1 systemd[95509]: Reached target Main User Target.
Dec 02 23:32:55 compute-1 systemd[95509]: Startup finished in 171ms.
Dec 02 23:32:55 compute-1 systemd[1]: Started User Manager for UID 0.
Dec 02 23:32:55 compute-1 systemd[1]: Started Session c1 of User root.
Dec 02 23:32:55 compute-1 ovn_controller[95464]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 23:32:55 compute-1 ovn_controller[95464]: INFO:__main__:Validating config file
Dec 02 23:32:55 compute-1 ovn_controller[95464]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 23:32:55 compute-1 ovn_controller[95464]: INFO:__main__:Writing out command to execute
Dec 02 23:32:55 compute-1 systemd[1]: session-c1.scope: Deactivated successfully.
Dec 02 23:32:55 compute-1 ovn_controller[95464]: ++ cat /run_command
Dec 02 23:32:55 compute-1 ovn_controller[95464]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 02 23:32:55 compute-1 ovn_controller[95464]: + ARGS=
Dec 02 23:32:55 compute-1 ovn_controller[95464]: + sudo kolla_copy_cacerts
Dec 02 23:32:55 compute-1 systemd[1]: Started Session c2 of User root.
Dec 02 23:32:55 compute-1 systemd[1]: session-c2.scope: Deactivated successfully.
Dec 02 23:32:55 compute-1 ovn_controller[95464]: + [[ ! -n '' ]]
Dec 02 23:32:55 compute-1 ovn_controller[95464]: + . kolla_extend_start
Dec 02 23:32:55 compute-1 ovn_controller[95464]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec 02 23:32:55 compute-1 ovn_controller[95464]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 02 23:32:55 compute-1 ovn_controller[95464]: + umask 0022
Dec 02 23:32:55 compute-1 ovn_controller[95464]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec 02 23:32:55 compute-1 ovn_controller[95464]: 2025-12-02T23:32:55Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 02 23:32:55 compute-1 ovn_controller[95464]: 2025-12-02T23:32:55Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 02 23:32:55 compute-1 ovn_controller[95464]: 2025-12-02T23:32:55Z|00003|main|INFO|OVN internal version is : [24.09.4-20.37.0-77.8]
Dec 02 23:32:55 compute-1 ovn_controller[95464]: 2025-12-02T23:32:55Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 02 23:32:55 compute-1 ovn_controller[95464]: 2025-12-02T23:32:55Z|00005|stream_ssl|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: connect: Address family not supported by protocol
Dec 02 23:32:55 compute-1 ovn_controller[95464]: 2025-12-02T23:32:55Z|00006|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 02 23:32:55 compute-1 ovn_controller[95464]: 2025-12-02T23:32:55Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Address family not supported by protocol)
Dec 02 23:32:55 compute-1 ovn_controller[95464]: 2025-12-02T23:32:55Z|00008|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 02 23:32:55 compute-1 ovn_controller[95464]: 2025-12-02T23:32:55Z|00009|ovn_util|INFO|statctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Dec 02 23:32:55 compute-1 ovn_controller[95464]: 2025-12-02T23:32:55Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 02 23:32:55 compute-1 ovn_controller[95464]: 2025-12-02T23:32:55Z|00011|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Dec 02 23:32:55 compute-1 ovn_controller[95464]: 2025-12-02T23:32:55Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Dec 02 23:32:55 compute-1 ovn_controller[95464]: 2025-12-02T23:32:55Z|00013|ovn_util|INFO|pinctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Dec 02 23:32:55 compute-1 ovn_controller[95464]: 2025-12-02T23:32:55Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 02 23:32:55 compute-1 ovn_controller[95464]: 2025-12-02T23:32:55Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Dec 02 23:32:55 compute-1 ovn_controller[95464]: 2025-12-02T23:32:55Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Dec 02 23:32:55 compute-1 NetworkManager[55553]: <info>  [1764718375.9312] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec 02 23:32:55 compute-1 NetworkManager[55553]: <info>  [1764718375.9323] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:32:55 compute-1 NetworkManager[55553]: <info>  [1764718375.9341] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Dec 02 23:32:55 compute-1 NetworkManager[55553]: <info>  [1764718375.9349] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Dec 02 23:32:55 compute-1 NetworkManager[55553]: <info>  [1764718375.9354] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 02 23:32:55 compute-1 kernel: br-int: entered promiscuous mode
Dec 02 23:32:55 compute-1 systemd-udevd[95615]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:32:56 compute-1 sudo[95723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znxrpikcbqwozqjdhoywjgwbxfybtkcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718375.993758-1199-45899718388750/AnsiballZ_command.py'
Dec 02 23:32:56 compute-1 sudo[95723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:56 compute-1 python3.9[95725]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:56 compute-1 ovs-vsctl[95726]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 02 23:32:56 compute-1 sudo[95723]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00001|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00001|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00019|ovn_util|INFO|features: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00021|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00022|features|INFO|OVS Feature: meter_support, state: supported
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00023|features|INFO|OVS Feature: group_support, state: supported
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00024|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00025|features|INFO|OVS Feature: ct_flush, state: supported
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00026|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00027|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00028|main|INFO|OVS feature set changed, force recompute.
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00029|ovn_util|INFO|ofctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00030|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00031|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00032|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00033|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00034|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00035|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 02 23:32:56 compute-1 ovn_controller[95464]: 2025-12-02T23:32:56Z|00036|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec 02 23:32:56 compute-1 NetworkManager[55553]: <info>  [1764718376.9566] manager: (ovn-83290d-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec 02 23:32:56 compute-1 NetworkManager[55553]: <info>  [1764718376.9576] manager: (ovn-5cbf2a-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Dec 02 23:32:56 compute-1 kernel: genev_sys_6081: entered promiscuous mode
Dec 02 23:32:56 compute-1 NetworkManager[55553]: <info>  [1764718376.9788] device (genev_sys_6081): carrier: link connected
Dec 02 23:32:56 compute-1 NetworkManager[55553]: <info>  [1764718376.9792] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Dec 02 23:32:57 compute-1 sudo[95879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpqbmzqlwcryjnclgcvsngguedhgessp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718376.838454-1215-154730542562090/AnsiballZ_command.py'
Dec 02 23:32:57 compute-1 sudo[95879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:57 compute-1 python3.9[95881]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:57 compute-1 ovs-vsctl[95883]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 02 23:32:57 compute-1 sudo[95879]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:58 compute-1 sudo[96034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xknmixfsykndimtxtlozckymfgtuelop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718377.7902784-1243-241069731348992/AnsiballZ_command.py'
Dec 02 23:32:58 compute-1 sudo[96034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:58 compute-1 python3.9[96036]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:58 compute-1 ovs-vsctl[96037]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 02 23:32:58 compute-1 sudo[96034]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:58 compute-1 sshd-session[84964]: Connection closed by 192.168.122.30 port 42204
Dec 02 23:32:58 compute-1 sshd-session[84961]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:32:58 compute-1 systemd[1]: session-20.scope: Deactivated successfully.
Dec 02 23:32:58 compute-1 systemd[1]: session-20.scope: Consumed 53.537s CPU time.
Dec 02 23:32:58 compute-1 systemd-logind[790]: Session 20 logged out. Waiting for processes to exit.
Dec 02 23:32:58 compute-1 systemd-logind[790]: Removed session 20.
Dec 02 23:33:04 compute-1 sshd-session[96062]: Accepted publickey for zuul from 192.168.122.30 port 52380 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:33:04 compute-1 systemd-logind[790]: New session 22 of user zuul.
Dec 02 23:33:04 compute-1 systemd[1]: Started Session 22 of User zuul.
Dec 02 23:33:04 compute-1 sshd-session[96062]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:33:05 compute-1 python3.9[96215]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:33:06 compute-1 systemd[1]: Stopping User Manager for UID 0...
Dec 02 23:33:06 compute-1 systemd[95509]: Activating special unit Exit the Session...
Dec 02 23:33:06 compute-1 systemd[95509]: Stopped target Main User Target.
Dec 02 23:33:06 compute-1 systemd[95509]: Stopped target Basic System.
Dec 02 23:33:06 compute-1 systemd[95509]: Stopped target Paths.
Dec 02 23:33:06 compute-1 systemd[95509]: Stopped target Sockets.
Dec 02 23:33:06 compute-1 systemd[95509]: Stopped target Timers.
Dec 02 23:33:06 compute-1 systemd[95509]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 02 23:33:06 compute-1 systemd[95509]: Closed D-Bus User Message Bus Socket.
Dec 02 23:33:06 compute-1 systemd[95509]: Stopped Create User's Volatile Files and Directories.
Dec 02 23:33:06 compute-1 systemd[95509]: Removed slice User Application Slice.
Dec 02 23:33:06 compute-1 systemd[95509]: Reached target Shutdown.
Dec 02 23:33:06 compute-1 systemd[95509]: Finished Exit the Session.
Dec 02 23:33:06 compute-1 systemd[95509]: Reached target Exit the Session.
Dec 02 23:33:06 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Dec 02 23:33:06 compute-1 systemd[1]: Stopped User Manager for UID 0.
Dec 02 23:33:06 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 02 23:33:06 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 02 23:33:06 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 02 23:33:06 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 02 23:33:06 compute-1 systemd[1]: Removed slice User Slice of UID 0.
Dec 02 23:33:06 compute-1 sudo[96372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtorkroracgstkvgaahbjhdhvcvdznuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718385.8854098-49-64798259083061/AnsiballZ_file.py'
Dec 02 23:33:06 compute-1 sudo[96372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:06 compute-1 python3.9[96374]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:06 compute-1 sudo[96372]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:07 compute-1 sudo[96524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sikylkixfsapcsvfgoebqykvqpkcilnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718386.772086-49-60687774468562/AnsiballZ_file.py'
Dec 02 23:33:07 compute-1 sudo[96524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:07 compute-1 python3.9[96526]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:07 compute-1 sudo[96524]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:07 compute-1 sudo[96676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoftmambvdewpnlrarnvoqdqamkfnisw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718387.4708843-49-187345843758426/AnsiballZ_file.py'
Dec 02 23:33:07 compute-1 sudo[96676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:07 compute-1 python3.9[96678]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:08 compute-1 sudo[96676]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:08 compute-1 sudo[96828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsgiapjtetiwyepeesrahplmdppcyrtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718388.2234905-49-7529007844030/AnsiballZ_file.py'
Dec 02 23:33:08 compute-1 sudo[96828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:08 compute-1 python3.9[96830]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:08 compute-1 sudo[96828]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:09 compute-1 sudo[96980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daskxgaqsukgdvtjnbhdxudtopcjoisr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718388.9889696-49-165740184201153/AnsiballZ_file.py'
Dec 02 23:33:09 compute-1 sudo[96980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:09 compute-1 python3.9[96982]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:09 compute-1 sudo[96980]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:10 compute-1 python3.9[97132]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:33:11 compute-1 sudo[97282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atlbwiayvvmuthqrfmbbiteuocsralxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718390.6639886-137-49507956364716/AnsiballZ_seboolean.py'
Dec 02 23:33:11 compute-1 sudo[97282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:11 compute-1 python3.9[97284]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 02 23:33:11 compute-1 sudo[97282]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:12 compute-1 python3.9[97434]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:13 compute-1 python3.9[97555]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718392.1892192-153-168616825949416/.source follow=False _original_basename=haproxy.j2 checksum=66fe13ac5fc047d8fb3860998b97ca468880e317 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:14 compute-1 python3.9[97705]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:14 compute-1 python3.9[97826]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718393.818008-183-183725265256915/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:15 compute-1 sudo[97977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yarssquqmqaoyypquvbcplweegoippsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718395.448267-217-163138658325886/AnsiballZ_setup.py'
Dec 02 23:33:15 compute-1 sudo[97977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:16 compute-1 python3.9[97979]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:33:16 compute-1 sudo[97977]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:16 compute-1 sudo[98061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jglcmjinsapjgubzcwcmqmemnytagtlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718395.448267-217-163138658325886/AnsiballZ_dnf.py'
Dec 02 23:33:16 compute-1 sudo[98061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:17 compute-1 python3.9[98063]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:33:18 compute-1 sudo[98061]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:19 compute-1 sudo[98214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epsdgxwrpymugmbqbjvidigdlhryapsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718398.51515-241-25775725351532/AnsiballZ_systemd.py'
Dec 02 23:33:19 compute-1 sudo[98214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:19 compute-1 python3.9[98216]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 23:33:20 compute-1 sudo[98214]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:21 compute-1 python3.9[98369]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:22 compute-1 python3.9[98490]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718401.0113623-257-170444174814645/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:22 compute-1 python3.9[98640]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:23 compute-1 python3.9[98761]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718402.3920684-257-17508969998282/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:24 compute-1 python3.9[98911]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:25 compute-1 ovn_controller[95464]: 2025-12-02T23:33:25Z|00037|memory|INFO|16072 kB peak resident set size after 29.6 seconds
Dec 02 23:33:25 compute-1 ovn_controller[95464]: 2025-12-02T23:33:25Z|00038|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Dec 02 23:33:25 compute-1 podman[99006]: 2025-12-02 23:33:25.552376188 +0000 UTC m=+0.144368128 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, tcib_managed=true)
Dec 02 23:33:25 compute-1 python3.9[99043]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718404.370856-345-158323655706853/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:26 compute-1 python3.9[99211]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:27 compute-1 python3.9[99332]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718405.8870616-345-62577582270529/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:28 compute-1 python3.9[99482]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:33:28 compute-1 sudo[99634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkfbtbsoztiyekqycqbxofzzcrxvaxys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718408.4043756-421-136687504466648/AnsiballZ_file.py'
Dec 02 23:33:28 compute-1 sudo[99634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:29 compute-1 python3.9[99636]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:29 compute-1 sudo[99634]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:29 compute-1 sudo[99786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlplvfwrcjexixqmrztxcvjufwcdhuku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718409.3500578-437-245708728035266/AnsiballZ_stat.py'
Dec 02 23:33:29 compute-1 sudo[99786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:29 compute-1 python3.9[99788]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:30 compute-1 sudo[99786]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:30 compute-1 sudo[99864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seshrtywwxvcivtcyvojzlpvdxcajcaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718409.3500578-437-245708728035266/AnsiballZ_file.py'
Dec 02 23:33:30 compute-1 sudo[99864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:30 compute-1 python3.9[99866]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:30 compute-1 sudo[99864]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:31 compute-1 sudo[100016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzgumnnrjmzyphzumzqpjdqpgvqognnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718410.7689364-437-89606158829289/AnsiballZ_stat.py'
Dec 02 23:33:31 compute-1 sudo[100016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:31 compute-1 python3.9[100018]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:31 compute-1 sudo[100016]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:31 compute-1 sudo[100094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxcfjbkaibixvnsiilclivigcbcqbybd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718410.7689364-437-89606158829289/AnsiballZ_file.py'
Dec 02 23:33:31 compute-1 sudo[100094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:31 compute-1 python3.9[100096]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:32 compute-1 sudo[100094]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:32 compute-1 sudo[100246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsgbxftyrqwjsealkiejztnvjhepvqjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718412.2409608-483-104937242085450/AnsiballZ_file.py'
Dec 02 23:33:32 compute-1 sudo[100246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:32 compute-1 python3.9[100248]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:32 compute-1 sudo[100246]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:33 compute-1 sudo[100398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vztsrxogjitihifckqjjksqcnpszhxfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718413.0974004-499-221834685848981/AnsiballZ_stat.py'
Dec 02 23:33:33 compute-1 sudo[100398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:33 compute-1 python3.9[100400]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:33 compute-1 sudo[100398]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:34 compute-1 sudo[100476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoowxczuplwmefzrihuugnplfrokyilr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718413.0974004-499-221834685848981/AnsiballZ_file.py'
Dec 02 23:33:34 compute-1 sudo[100476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:34 compute-1 python3.9[100478]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:34 compute-1 sudo[100476]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:34 compute-1 sudo[100628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lftxrcdxzyqwvhcrxudopunmpmuxdrod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718414.519035-523-206799421643010/AnsiballZ_stat.py'
Dec 02 23:33:34 compute-1 sudo[100628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:35 compute-1 python3.9[100630]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:35 compute-1 sudo[100628]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:35 compute-1 sudo[100706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-limgvfbktsfgpgacbmpqpcvbamtrbvds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718414.519035-523-206799421643010/AnsiballZ_file.py'
Dec 02 23:33:35 compute-1 sudo[100706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:35 compute-1 python3.9[100708]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:35 compute-1 sudo[100706]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:36 compute-1 sudo[100858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhdtiptzumqmsaszbkcqhsnqqspkdzce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718415.9204385-547-258521884125848/AnsiballZ_systemd.py'
Dec 02 23:33:36 compute-1 sudo[100858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:36 compute-1 python3.9[100860]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:33:36 compute-1 systemd[1]: Reloading.
Dec 02 23:33:36 compute-1 systemd-rc-local-generator[100888]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:33:36 compute-1 systemd-sysv-generator[100891]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:33:37 compute-1 sudo[100858]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:37 compute-1 sudo[101047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcqqahgsgdbujityuczfsmnawxksazfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718417.2866094-563-62082279438554/AnsiballZ_stat.py'
Dec 02 23:33:37 compute-1 sudo[101047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:37 compute-1 python3.9[101049]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:37 compute-1 sudo[101047]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:38 compute-1 sudo[101125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-howadkpinzzechbtteejbrzwnijzzwez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718417.2866094-563-62082279438554/AnsiballZ_file.py'
Dec 02 23:33:38 compute-1 sudo[101125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:38 compute-1 python3.9[101127]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:38 compute-1 sudo[101125]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:39 compute-1 sudo[101277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyvadqixmfroqypgmeuixagjbamqeafo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718418.679326-587-246735027607353/AnsiballZ_stat.py'
Dec 02 23:33:39 compute-1 sudo[101277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:39 compute-1 python3.9[101279]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:39 compute-1 sudo[101277]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:39 compute-1 sudo[101355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxqecmuuivsqmttridmeutdssfjfaitm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718418.679326-587-246735027607353/AnsiballZ_file.py'
Dec 02 23:33:39 compute-1 sudo[101355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:39 compute-1 python3.9[101357]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:39 compute-1 sudo[101355]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:40 compute-1 sudo[101507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryfazalwgoztbpfphvsycmfjmunhdlzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718420.1435678-611-163278665501098/AnsiballZ_systemd.py'
Dec 02 23:33:40 compute-1 sudo[101507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:40 compute-1 python3.9[101509]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:33:40 compute-1 systemd[1]: Reloading.
Dec 02 23:33:40 compute-1 systemd-rc-local-generator[101538]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:33:40 compute-1 systemd-sysv-generator[101543]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:33:41 compute-1 systemd[1]: Starting Create netns directory...
Dec 02 23:33:41 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 23:33:41 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 23:33:41 compute-1 systemd[1]: Finished Create netns directory.
Dec 02 23:33:41 compute-1 sudo[101507]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:42 compute-1 sudo[101700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyyiqhlambkamsrbyautrmiuxyhbgsjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718421.5886047-631-166089506090395/AnsiballZ_file.py'
Dec 02 23:33:42 compute-1 sudo[101700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:42 compute-1 python3.9[101702]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:42 compute-1 sudo[101700]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:42 compute-1 sudo[101852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwllkmhhlgokbvmfcnxgqmmuaahsqokf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718422.4749-647-177956415396823/AnsiballZ_stat.py'
Dec 02 23:33:42 compute-1 sudo[101852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:43 compute-1 python3.9[101854]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:43 compute-1 sudo[101852]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:43 compute-1 sudo[101975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zigxhkcfjhkedayqlbiovklzjryawnpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718422.4749-647-177956415396823/AnsiballZ_copy.py'
Dec 02 23:33:43 compute-1 sudo[101975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:43 compute-1 python3.9[101977]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718422.4749-647-177956415396823/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:43 compute-1 sudo[101975]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:44 compute-1 sudo[102127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnkkmqymjfolpopvhawocswjnjxmnugg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718424.1349502-681-185009522473472/AnsiballZ_file.py'
Dec 02 23:33:44 compute-1 sudo[102127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:44 compute-1 python3.9[102129]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:44 compute-1 sudo[102127]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:45 compute-1 sudo[102279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccavosymorkzvyvgsranjkbrwrjvsgkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718424.975993-697-108442977088923/AnsiballZ_stat.py'
Dec 02 23:33:45 compute-1 sudo[102279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:45 compute-1 python3.9[102281]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:45 compute-1 sudo[102279]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:45 compute-1 sudo[102402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzajrtxlawdkxhqbbijqtzmsglaeluoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718424.975993-697-108442977088923/AnsiballZ_copy.py'
Dec 02 23:33:45 compute-1 sudo[102402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:46 compute-1 python3.9[102404]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718424.975993-697-108442977088923/.source.json _original_basename=.n5v5vz5e follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:46 compute-1 sudo[102402]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:46 compute-1 sudo[102554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpkvodhvjwjeprmlhqnrfspugqkotnce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718426.4731941-727-193789462365416/AnsiballZ_file.py'
Dec 02 23:33:46 compute-1 sudo[102554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:47 compute-1 python3.9[102556]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:47 compute-1 sudo[102554]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:47 compute-1 sudo[102706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prgdnuwbulrcqjluniniagltjjidijoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718427.3998704-743-55062482316927/AnsiballZ_stat.py'
Dec 02 23:33:47 compute-1 sudo[102706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:48 compute-1 sudo[102706]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:48 compute-1 sudo[102829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsqvfzfogsjecatrywxcxazhlabsprqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718427.3998704-743-55062482316927/AnsiballZ_copy.py'
Dec 02 23:33:48 compute-1 sudo[102829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:48 compute-1 sudo[102829]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:49 compute-1 sudo[102981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acriotbyudwauzcpbtzckkcokrocgloe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718429.0990732-777-216991364683321/AnsiballZ_container_config_data.py'
Dec 02 23:33:49 compute-1 sudo[102981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:49 compute-1 python3.9[102983]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec 02 23:33:49 compute-1 sudo[102981]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:50 compute-1 sudo[103133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nidtyedvhpmoiznjwgugmeyyrecppnvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718430.1964147-795-119027657086078/AnsiballZ_container_config_hash.py'
Dec 02 23:33:50 compute-1 sudo[103133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:50 compute-1 python3.9[103135]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 23:33:50 compute-1 sudo[103133]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:51 compute-1 sudo[103285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plfjokwxrjrfkwpkxizrcqspavfqpjah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718431.319314-813-276076449176731/AnsiballZ_podman_container_info.py'
Dec 02 23:33:51 compute-1 sudo[103285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:52 compute-1 python3.9[103287]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 02 23:33:52 compute-1 sudo[103285]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:53 compute-1 sudo[103463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzbkzmnoaxpshbozmqycivgsenpberwx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718432.965441-839-257713346333910/AnsiballZ_edpm_container_manage.py'
Dec 02 23:33:53 compute-1 sudo[103463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:53 compute-1 python3[103465]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 23:33:53 compute-1 podman[103502]: 2025-12-02 23:33:53.90454963 +0000 UTC m=+0.053164404 container create 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent)
Dec 02 23:33:53 compute-1 podman[103502]: 2025-12-02 23:33:53.879086845 +0000 UTC m=+0.027701629 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 02 23:33:53 compute-1 python3[103465]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 02 23:33:54 compute-1 sudo[103463]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:54 compute-1 sudo[103691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opxaprchgbdslhlilbyqgaleylxttwqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718434.2467394-855-225925190719663/AnsiballZ_stat.py'
Dec 02 23:33:54 compute-1 sudo[103691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:54 compute-1 python3.9[103693]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:33:54 compute-1 sudo[103691]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:55 compute-1 sudo[103855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjowzqizmdvbzycvdlefwdfkzrnltwrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718435.2987063-873-142738495110423/AnsiballZ_file.py'
Dec 02 23:33:55 compute-1 sudo[103855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:55 compute-1 podman[103819]: 2025-12-02 23:33:55.865174891 +0000 UTC m=+0.157163035 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 02 23:33:55 compute-1 python3.9[103862]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:56 compute-1 sudo[103855]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:56 compute-1 sudo[103946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tffvkuuvqtuzplgvqmtptdkuddzjwnmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718435.2987063-873-142738495110423/AnsiballZ_stat.py'
Dec 02 23:33:56 compute-1 sudo[103946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:56 compute-1 python3.9[103948]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:33:56 compute-1 sudo[103946]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:57 compute-1 sudo[104097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucjvafazqynpoacbjnkalwaokzywrkpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718436.60791-873-92865446614104/AnsiballZ_copy.py'
Dec 02 23:33:57 compute-1 sudo[104097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:57 compute-1 python3.9[104099]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764718436.60791-873-92865446614104/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:57 compute-1 sudo[104097]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:57 compute-1 sudo[104173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fibpngwyuecrtejdnncnphdcaafvbiwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718436.60791-873-92865446614104/AnsiballZ_systemd.py'
Dec 02 23:33:57 compute-1 sudo[104173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:58 compute-1 python3.9[104175]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:33:58 compute-1 systemd[1]: Reloading.
Dec 02 23:33:58 compute-1 systemd-sysv-generator[104206]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:33:58 compute-1 systemd-rc-local-generator[104202]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:33:58 compute-1 sudo[104173]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:58 compute-1 sudo[104284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrwwxvhrhaoekmgqmustzgajacufovtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718436.60791-873-92865446614104/AnsiballZ_systemd.py'
Dec 02 23:33:58 compute-1 sudo[104284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:59 compute-1 python3.9[104286]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:33:59 compute-1 systemd[1]: Reloading.
Dec 02 23:33:59 compute-1 systemd-rc-local-generator[104312]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:33:59 compute-1 systemd-sysv-generator[104319]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:33:59 compute-1 systemd[1]: Starting ovn_metadata_agent container...
Dec 02 23:33:59 compute-1 systemd[1]: Started libcrun container.
Dec 02 23:33:59 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9ea6a9dfe2eb8884dba6da65709fa9754238a2bec4ec82753c469f9c62a7f87/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 02 23:33:59 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9ea6a9dfe2eb8884dba6da65709fa9754238a2bec4ec82753c469f9c62a7f87/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 23:33:59 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633.
Dec 02 23:33:59 compute-1 podman[104327]: 2025-12-02 23:33:59.681440498 +0000 UTC m=+0.184900893 container init 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: + sudo -E kolla_set_configs
Dec 02 23:33:59 compute-1 podman[104327]: 2025-12-02 23:33:59.717636879 +0000 UTC m=+0.221097234 container start 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 23:33:59 compute-1 edpm-start-podman-container[104327]: ovn_metadata_agent
Dec 02 23:33:59 compute-1 edpm-start-podman-container[104326]: Creating additional drop-in dependency for "ovn_metadata_agent" (7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633)
Dec 02 23:33:59 compute-1 systemd[1]: Reloading.
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: INFO:__main__:Validating config file
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: INFO:__main__:Copying service configuration files
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: INFO:__main__:Writing out command to execute
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: ++ cat /run_command
Dec 02 23:33:59 compute-1 podman[104350]: 2025-12-02 23:33:59.824709353 +0000 UTC m=+0.083874454 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: + CMD=neutron-ovn-metadata-agent
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: + ARGS=
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: + sudo kolla_copy_cacerts
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: + [[ ! -n '' ]]
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: + . kolla_extend_start
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: Running command: 'neutron-ovn-metadata-agent'
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: + umask 0022
Dec 02 23:33:59 compute-1 ovn_metadata_agent[104343]: + exec neutron-ovn-metadata-agent
Dec 02 23:33:59 compute-1 systemd-rc-local-generator[104419]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:33:59 compute-1 systemd-sysv-generator[104422]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:34:00 compute-1 systemd[1]: Started ovn_metadata_agent container.
Dec 02 23:34:00 compute-1 sudo[104284]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:00 compute-1 sshd-session[96065]: Connection closed by 192.168.122.30 port 52380
Dec 02 23:34:00 compute-1 sshd-session[96062]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:34:00 compute-1 systemd[1]: session-22.scope: Deactivated successfully.
Dec 02 23:34:00 compute-1 systemd[1]: session-22.scope: Consumed 42.275s CPU time.
Dec 02 23:34:00 compute-1 systemd-logind[790]: Session 22 logged out. Waiting for processes to exit.
Dec 02 23:34:00 compute-1 systemd-logind[790]: Removed session 22.
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.625 104348 INFO neutron.common.config [-] Logging enabled!
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.625 104348 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 26.1.0.dev268
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.626 104348 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.12/site-packages/neutron/common/config.py:124
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.626 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.626 104348 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.626 104348 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.626 104348 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.626 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.626 104348 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.627 104348 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.627 104348 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.627 104348 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.627 104348 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.627 104348 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.627 104348 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.627 104348 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.627 104348 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.627 104348 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.627 104348 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.627 104348 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.627 104348 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.627 104348 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.627 104348 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.628 104348 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.628 104348 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.628 104348 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.628 104348 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.628 104348 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.628 104348 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.628 104348 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.628 104348 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.628 104348 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.628 104348 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.628 104348 DEBUG neutron.agent.ovn.metadata_agent [-] enable_signals                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.628 104348 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.628 104348 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.629 104348 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.629 104348 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.629 104348 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.629 104348 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.629 104348 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.629 104348 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.629 104348 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.629 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.629 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.629 104348 DEBUG neutron.agent.ovn.metadata_agent [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.629 104348 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.629 104348 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.629 104348 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.629 104348 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.630 104348 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.630 104348 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.630 104348 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.630 104348 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.630 104348 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.630 104348 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.630 104348 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.630 104348 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.630 104348 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.630 104348 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.630 104348 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.630 104348 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.630 104348 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.630 104348 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.630 104348 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.630 104348 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.631 104348 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.631 104348 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.631 104348 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.631 104348 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.631 104348 DEBUG neutron.agent.ovn.metadata_agent [-] my_ip                          = 38.102.83.74 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.631 104348 DEBUG neutron.agent.ovn.metadata_agent [-] my_ipv6                        = ::1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.631 104348 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.631 104348 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.631 104348 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.631 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.631 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.631 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.632 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.632 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.632 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.632 104348 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.632 104348 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.632 104348 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.632 104348 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.632 104348 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.632 104348 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.632 104348 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.632 104348 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.632 104348 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.632 104348 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.632 104348 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.632 104348 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.633 104348 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.633 104348 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.633 104348 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.633 104348 DEBUG neutron.agent.ovn.metadata_agent [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.633 104348 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.633 104348 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.633 104348 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.633 104348 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.633 104348 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.633 104348 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.633 104348 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.633 104348 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.633 104348 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.633 104348 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_qinq                      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.633 104348 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.634 104348 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.634 104348 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.634 104348 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.634 104348 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.634 104348 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.634 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.634 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.634 104348 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.634 104348 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.634 104348 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.634 104348 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.634 104348 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.634 104348 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.634 104348 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.635 104348 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.635 104348 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.635 104348 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_requests        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.635 104348 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.635 104348 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.process_tags   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.635 104348 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.635 104348 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_otlp.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.635 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.635 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.635 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.635 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.635 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.635 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.636 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.636 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.636 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.636 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.636 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.636 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.636 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.636 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.636 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.636 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_timeout     = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.636 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.636 104348 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.636 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.637 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.637 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.637 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.log_daemon_traceback   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.637 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.637 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.637 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.637 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.637 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.637 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.637 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.637 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.637 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.637 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.637 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.637 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.638 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.638 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.638 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.638 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.638 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.638 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.638 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.638 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.638 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.638 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.638 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.638 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.638 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.638 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.638 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.638 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.639 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.639 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.639 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.639 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.639 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.639 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.639 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.639 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.639 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.639 104348 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.639 104348 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.639 104348 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.639 104348 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.639 104348 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.640 104348 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.640 104348 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.640 104348 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.640 104348 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.640 104348 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.640 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.640 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mappings            = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.640 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.datapath_type              = system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.640 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.640 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_reports         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.640 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_unregistered    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.640 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.640 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.int_peer_patch_port        = patch-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.640 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.integration_bridge         = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.641 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.local_ip                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.641 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_connect_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.641 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_inactivity_probe        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.641 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_address          = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.641 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_port             = 6633 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.641 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_request_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.641 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.openflow_processed_per_port = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.641 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.641 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_debug                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.641 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.641 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.qos_meter_bandwidth        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.641 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_bandwidths = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.641 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_default_hypervisor = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.641 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_hypervisors = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.641 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.642 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.642 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_with_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.642 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_without_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.642 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_ca_cert_file           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.642 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_cert_file              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.642 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_key_file               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.642 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tun_peer_patch_port        = patch-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.642 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tunnel_bridge              = br-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.642 104348 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.vhostuser_socket_dir       = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.642 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.642 104348 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.642 104348 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.642 104348 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.642 104348 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.643 104348 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.643 104348 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.643 104348 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.643 104348 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.643 104348 DEBUG neutron.agent.ovn.metadata_agent [-] agent.extensions               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.643 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.643 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.643 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.643 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.643 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.643 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.643 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.643 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.643 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.644 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.644 104348 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.644 104348 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.644 104348 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.644 104348 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.644 104348 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.644 104348 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.644 104348 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.644 104348 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.644 104348 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.644 104348 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.644 104348 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.644 104348 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.644 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.645 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.645 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.645 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.645 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.645 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.645 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.645 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.645 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.645 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.645 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.645 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.645 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.645 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.645 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.645 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.645 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.646 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.646 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.646 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.646 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.646 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.646 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.646 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.646 104348 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.646 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.broadcast_arps_to_all_routers = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.646 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.646 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.646 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_records_ovn_owned      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.646 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.646 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.647 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.fdb_age_threshold          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.647 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.live_migration_activation_strategy = rarp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.647 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.localnet_learn_fdb         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.647 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.mac_binding_age_threshold  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.647 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.647 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.647 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.647 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.647 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.647 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.647 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.647 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.647 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = ['tcp:127.0.0.1:6641'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.648 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.648 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_router_indirect_snat   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.648 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.648 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.648 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ['ssl:ovsdbserver-sb.openstack.svc:6642'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.648 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.648 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.648 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.648 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.648 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.648 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.648 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.fdb_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.648 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.ignore_lsp_down  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.648 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.mac_binding_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.649 104348 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.649 104348 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.649 104348 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.649 104348 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.649 104348 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.ip_versions = [4] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.649 104348 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.rate_limit_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.649 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.649 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.649 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.649 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.649 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.649 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.649 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.649 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.650 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.650 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.650 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.650 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.hostname = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.650 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.650 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.650 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.650 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.650 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.650 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.processname = neutron-ovn-metadata-agent log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.650 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.650 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.650 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.651 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.651 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.651 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.651 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.651 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.651 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.651 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.651 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.651 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.651 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.651 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.651 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.651 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.651 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.651 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.652 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.652 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.652 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.652 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.652 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.652 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.652 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.652 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.652 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.652 104348 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.652 104348 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.691 104348 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.691 104348 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.691 104348 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.692 104348 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.692 104348 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.702 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name e895a64d-10b7-4a6e-a7ff-0745f1562623 (UUID: e895a64d-10b7-4a6e-a7ff-0745f1562623) and ovn bridge br-int. _load_config /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:419
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.727 104348 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.727 104348 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.727 104348 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Port_Binding.logical_port autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.728 104348 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.728 104348 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.730 104348 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.735 104348 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.742 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'e895a64d-10b7-4a6e-a7ff-0745f1562623'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], external_ids={}, name=e895a64d-10b7-4a6e-a7ff-0745f1562623, nb_cfg_timestamp=1764718384950, nb_cfg=1) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:01.744 104348 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp217poatx/privsep.sock']
Dec 02 23:34:02 compute-1 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec 02 23:34:02 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:02.526 104348 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 02 23:34:02 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:02.527 104348 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp217poatx/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Dec 02 23:34:02 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:02.380 104464 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 23:34:02 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:02.384 104464 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 23:34:02 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:02.386 104464 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 02 23:34:02 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:02.386 104464 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104464
Dec 02 23:34:02 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:02.530 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb53e85-07d7-48c2-bd3f-2ab7d0949fa0]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:34:03 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:03.018 104464 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:34:03 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:03.018 104464 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:34:03 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:03.018 104464 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:34:03 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:03.522 104464 INFO oslo_service.backend [-] Loading backend: eventlet
Dec 02 23:34:03 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:03.527 104464 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Dec 02 23:34:03 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:03.566 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[7b5f240d-fba9-4678-b295-393fb3d4f4dd]: (4, []) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:34:03 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:03.568 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, column=external_ids, values=({'neutron:ovn-metadata-id': '16aa5006-4920-5349-8b66-26e1a663b3f0'},)) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:34:03 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:03.575 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:34:03 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:34:03.580 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:34:05 compute-1 sshd-session[104469]: Accepted publickey for zuul from 192.168.122.30 port 57342 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:34:05 compute-1 systemd-logind[790]: New session 23 of user zuul.
Dec 02 23:34:05 compute-1 systemd[1]: Started Session 23 of User zuul.
Dec 02 23:34:05 compute-1 sshd-session[104469]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:34:07 compute-1 python3.9[104622]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:34:08 compute-1 sudo[104776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btrmzcquhmtyvlbxazzqmantvhsmhufm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718447.6609983-49-226049952735845/AnsiballZ_command.py'
Dec 02 23:34:08 compute-1 sudo[104776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:08 compute-1 python3.9[104778]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:08 compute-1 sudo[104776]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:09 compute-1 sudo[104940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqkcoevvpqtznckuqndulkzswjhlzuaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718448.8378034-71-171280113325665/AnsiballZ_systemd_service.py'
Dec 02 23:34:09 compute-1 sudo[104940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:09 compute-1 python3.9[104942]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:34:09 compute-1 systemd[1]: Reloading.
Dec 02 23:34:09 compute-1 systemd-rc-local-generator[104968]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:34:09 compute-1 systemd-sysv-generator[104971]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:34:10 compute-1 sudo[104940]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:10 compute-1 python3.9[105127]: ansible-ansible.builtin.service_facts Invoked
Dec 02 23:34:11 compute-1 network[105144]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 23:34:11 compute-1 network[105145]: 'network-scripts' will be removed from distribution in near future.
Dec 02 23:34:11 compute-1 network[105146]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 23:34:16 compute-1 sudo[105405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvjwaxfzchppjmyfmpnrsjstzjazhdza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718456.293177-109-270964497158289/AnsiballZ_systemd_service.py'
Dec 02 23:34:16 compute-1 sudo[105405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:17 compute-1 python3.9[105407]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:34:17 compute-1 sudo[105405]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:17 compute-1 sudo[105558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orpqmhpiaghmlsyaypjwdlnnomeeioga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718457.2239373-109-8558196211857/AnsiballZ_systemd_service.py'
Dec 02 23:34:17 compute-1 sudo[105558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:17 compute-1 python3.9[105560]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:34:17 compute-1 sudo[105558]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:18 compute-1 sudo[105711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmpvtvqiswrzkkthkwgmxqughwxnjvra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718458.0059404-109-76804260034495/AnsiballZ_systemd_service.py'
Dec 02 23:34:18 compute-1 sudo[105711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:18 compute-1 python3.9[105713]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:34:18 compute-1 sudo[105711]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:19 compute-1 sudo[105864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eslbpfeaynewwzefjbxkqjrdnfhmbcdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718458.8268018-109-214367707901562/AnsiballZ_systemd_service.py'
Dec 02 23:34:19 compute-1 sudo[105864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:19 compute-1 python3.9[105866]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:34:19 compute-1 sudo[105864]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:19 compute-1 sudo[106017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxtwfgwathnotypeqsqicrcyzyldganu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718459.6606166-109-233523644859431/AnsiballZ_systemd_service.py'
Dec 02 23:34:19 compute-1 sudo[106017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:20 compute-1 python3.9[106019]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:34:20 compute-1 sudo[106017]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:20 compute-1 sudo[106170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znacbpzulvskxvurjdiarfqbexquuxjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718460.4508364-109-99532878532692/AnsiballZ_systemd_service.py'
Dec 02 23:34:20 compute-1 sudo[106170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:21 compute-1 python3.9[106172]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:34:21 compute-1 sudo[106170]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:21 compute-1 sudo[106323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdonnsksecgndqjcoykfzewxhxzbjisx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718461.3321607-109-228422590199428/AnsiballZ_systemd_service.py'
Dec 02 23:34:21 compute-1 sudo[106323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:21 compute-1 python3.9[106325]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:34:21 compute-1 sudo[106323]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:22 compute-1 sudo[106476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvvtgerubavqohtaiyznmsklbtirxpsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718462.3892186-213-164060057353394/AnsiballZ_file.py'
Dec 02 23:34:22 compute-1 sudo[106476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:23 compute-1 python3.9[106478]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:23 compute-1 sudo[106476]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:23 compute-1 sudo[106628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hitfhfsrbdbvpsrfmpwikxqxgtmewiyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718463.308189-213-159848392486836/AnsiballZ_file.py'
Dec 02 23:34:23 compute-1 sudo[106628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:23 compute-1 python3.9[106630]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:23 compute-1 sudo[106628]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:24 compute-1 sudo[106780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnroebxbcxoopqyopujevscksqdyycnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718464.0270133-213-30614716605129/AnsiballZ_file.py'
Dec 02 23:34:24 compute-1 sudo[106780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:24 compute-1 python3.9[106782]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:24 compute-1 sudo[106780]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:25 compute-1 sudo[106932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juwbovtwlaemcwvfiifrwgdckymswxle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718464.6754577-213-234092937071230/AnsiballZ_file.py'
Dec 02 23:34:25 compute-1 sudo[106932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:25 compute-1 python3.9[106934]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:25 compute-1 sudo[106932]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:25 compute-1 sudo[107084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fntyionzwarheubyboajlyorgoaflmoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718465.3862827-213-231680669027614/AnsiballZ_file.py'
Dec 02 23:34:25 compute-1 sudo[107084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:25 compute-1 python3.9[107086]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:25 compute-1 sudo[107084]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:26 compute-1 podman[107132]: 2025-12-02 23:34:26.30900004 +0000 UTC m=+0.121154639 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 23:34:26 compute-1 sudo[107263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxyfkjxmsnxtqxksxwrdfnqxqdoowwqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718466.152738-213-270036162605914/AnsiballZ_file.py'
Dec 02 23:34:26 compute-1 sudo[107263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:26 compute-1 python3.9[107265]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:26 compute-1 sudo[107263]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:27 compute-1 sudo[107415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nslmodjjsnkenfqekzaigzbodpjslist ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718466.876419-213-128569488053672/AnsiballZ_file.py'
Dec 02 23:34:27 compute-1 sudo[107415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:27 compute-1 python3.9[107417]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:27 compute-1 sudo[107415]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:28 compute-1 sudo[107567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzwfltxbtsvzlrnkzedfvyhqtnpuogik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718467.7064347-313-272047986275266/AnsiballZ_file.py'
Dec 02 23:34:28 compute-1 sudo[107567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:28 compute-1 python3.9[107569]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:28 compute-1 sudo[107567]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:28 compute-1 sudo[107719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhkhttvfyfnsdsddwyroxdyczlhqurap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718468.5108871-313-82861379861233/AnsiballZ_file.py'
Dec 02 23:34:28 compute-1 sudo[107719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:29 compute-1 python3.9[107721]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:29 compute-1 sudo[107719]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:29 compute-1 sudo[107871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycitmmyblcrwwbxnodkzywmdxldmgnpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718469.29304-313-108023386092194/AnsiballZ_file.py'
Dec 02 23:34:29 compute-1 sudo[107871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:29 compute-1 python3.9[107873]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:29 compute-1 sudo[107871]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:30 compute-1 sudo[108040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoasliihdpsltqiioybatpbrdpgnpecm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718469.915104-313-265650735496958/AnsiballZ_file.py'
Dec 02 23:34:30 compute-1 sudo[108040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:30 compute-1 podman[107981]: 2025-12-02 23:34:30.277651688 +0000 UTC m=+0.097831795 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:34:30 compute-1 python3.9[108044]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:30 compute-1 sudo[108040]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:30 compute-1 sudo[108194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzejmrtkfdfjwiiehigaewktvsgihxlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718470.6049912-313-203470002499874/AnsiballZ_file.py'
Dec 02 23:34:30 compute-1 sudo[108194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:31 compute-1 python3.9[108196]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:31 compute-1 sudo[108194]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:31 compute-1 sudo[108346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmkzkabkftlwnctmpwapuwehzvmsimob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718471.3399785-313-81182386301973/AnsiballZ_file.py'
Dec 02 23:34:31 compute-1 sudo[108346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:31 compute-1 python3.9[108348]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:31 compute-1 sudo[108346]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:32 compute-1 sudo[108498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkejtknarhwnkpxohbavvyhuqdgagliz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718472.1363904-313-177051298637632/AnsiballZ_file.py'
Dec 02 23:34:32 compute-1 sudo[108498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:32 compute-1 python3.9[108500]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:32 compute-1 sudo[108498]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:33 compute-1 sudo[108650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkukwafvtwxlelvslpvlxrozehtydmua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718473.0797555-415-137068054761429/AnsiballZ_command.py'
Dec 02 23:34:33 compute-1 sudo[108650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:33 compute-1 python3.9[108652]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:33 compute-1 sudo[108650]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:34 compute-1 python3.9[108804]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 23:34:35 compute-1 sudo[108954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylwtwmddrtcquqxypiueeuyxxmoxsetv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718474.924066-451-232120894315023/AnsiballZ_systemd_service.py'
Dec 02 23:34:35 compute-1 sudo[108954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:35 compute-1 python3.9[108956]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:34:35 compute-1 systemd[1]: Reloading.
Dec 02 23:34:35 compute-1 systemd-rc-local-generator[108983]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:34:35 compute-1 systemd-sysv-generator[108989]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:34:35 compute-1 sudo[108954]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:36 compute-1 sudo[109142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdqraxrbbcdjupwcghlhzrvizhjhqecc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718476.1569319-467-101302349503775/AnsiballZ_command.py'
Dec 02 23:34:36 compute-1 sudo[109142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:36 compute-1 python3.9[109144]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:36 compute-1 sudo[109142]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:37 compute-1 sudo[109295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xasgwxrjjbyxbiehbonenfvstdwxehfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718477.0020995-467-38669800827451/AnsiballZ_command.py'
Dec 02 23:34:37 compute-1 sudo[109295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:37 compute-1 python3.9[109297]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:37 compute-1 sudo[109295]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:38 compute-1 sudo[109448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qztxierqzqnphsnchqztmwchqdwvdosa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718477.77761-467-196706429985399/AnsiballZ_command.py'
Dec 02 23:34:38 compute-1 sudo[109448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:38 compute-1 python3.9[109450]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:38 compute-1 sudo[109448]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:38 compute-1 sudo[109601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzxjbvgjphogesunowhjumytymzbpvfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718478.6146233-467-266992269603901/AnsiballZ_command.py'
Dec 02 23:34:38 compute-1 sudo[109601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:39 compute-1 python3.9[109603]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:39 compute-1 sudo[109601]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:39 compute-1 sudo[109754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfyzofqgsasymfhyxvbjkavzzzperyiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718479.406039-467-20257449228443/AnsiballZ_command.py'
Dec 02 23:34:39 compute-1 sudo[109754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:39 compute-1 python3.9[109756]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:39 compute-1 sudo[109754]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:40 compute-1 sudo[109907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfpsvqvgoyxrqnxbrfkpkojdmyuudtum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718480.1331067-467-171195782668218/AnsiballZ_command.py'
Dec 02 23:34:40 compute-1 sudo[109907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:40 compute-1 python3.9[109909]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:40 compute-1 sudo[109907]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:41 compute-1 sudo[110060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywwipsydeeiokwtpwxggpxerfimjlruu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718480.9842389-467-187235077922302/AnsiballZ_command.py'
Dec 02 23:34:41 compute-1 sudo[110060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:41 compute-1 python3.9[110062]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:41 compute-1 sudo[110060]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:42 compute-1 sudo[110213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpevgscsdtcfzvsqpebdainjqciaxjyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718482.0777318-575-205267348413781/AnsiballZ_getent.py'
Dec 02 23:34:42 compute-1 sudo[110213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:42 compute-1 python3.9[110215]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec 02 23:34:42 compute-1 sudo[110213]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:43 compute-1 sudo[110366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbdkekevcgszxcaycapjsawbzuwlxruv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718483.1434069-591-243899421646546/AnsiballZ_group.py'
Dec 02 23:34:43 compute-1 sudo[110366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:43 compute-1 python3.9[110368]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 02 23:34:43 compute-1 groupadd[110369]: group added to /etc/group: name=libvirt, GID=42473
Dec 02 23:34:43 compute-1 groupadd[110369]: group added to /etc/gshadow: name=libvirt
Dec 02 23:34:43 compute-1 groupadd[110369]: new group: name=libvirt, GID=42473
Dec 02 23:34:43 compute-1 sudo[110366]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:44 compute-1 sudo[110524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qllepzdmllzanwkpystctvodvpnncqqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718484.180125-607-38735226071860/AnsiballZ_user.py'
Dec 02 23:34:44 compute-1 sudo[110524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:44 compute-1 python3.9[110526]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 02 23:34:44 compute-1 useradd[110528]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Dec 02 23:34:45 compute-1 sudo[110524]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:45 compute-1 sudo[110685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjrevqbczqavxefqbgcyuiqmpgqhrwro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718485.500889-629-215542198207498/AnsiballZ_setup.py'
Dec 02 23:34:45 compute-1 sudo[110685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:46 compute-1 python3.9[110687]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:34:46 compute-1 sudo[110685]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:46 compute-1 sudo[110769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdzvfmogafqaujqlkyenfeckfcfkrbbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718485.500889-629-215542198207498/AnsiballZ_dnf.py'
Dec 02 23:34:46 compute-1 sudo[110769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:47 compute-1 python3.9[110771]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:34:57 compute-1 podman[110856]: 2025-12-02 23:34:57.291492532 +0000 UTC m=+0.118475195 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Dec 02 23:35:01 compute-1 podman[110980]: 2025-12-02 23:35:01.205255074 +0000 UTC m=+0.049938290 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true)
Dec 02 23:35:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:35:01.654 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:35:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:35:01.654 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:35:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:35:01.654 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:35:14 compute-1 kernel: SELinux:  Converting 2759 SID table entries...
Dec 02 23:35:14 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:35:14 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 02 23:35:14 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:35:14 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:35:14 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:35:14 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:35:14 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:35:25 compute-1 kernel: SELinux:  Converting 2759 SID table entries...
Dec 02 23:35:25 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:35:25 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 02 23:35:25 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:35:25 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:35:25 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:35:25 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:35:25 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:35:28 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec 02 23:35:28 compute-1 podman[111024]: 2025-12-02 23:35:28.3356044 +0000 UTC m=+0.140473967 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 02 23:35:32 compute-1 podman[111050]: 2025-12-02 23:35:32.235324208 +0000 UTC m=+0.066381089 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 23:35:59 compute-1 podman[122220]: 2025-12-02 23:35:59.247485398 +0000 UTC m=+0.088580700 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 23:36:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:36:01.655 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:36:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:36:01.655 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:36:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:36:01.656 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:36:03 compute-1 podman[124306]: 2025-12-02 23:36:03.25402827 +0000 UTC m=+0.088236573 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 02 23:36:23 compute-1 kernel: SELinux:  Converting 2760 SID table entries...
Dec 02 23:36:23 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:36:23 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 02 23:36:23 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:36:23 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:36:23 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:36:23 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:36:23 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:36:24 compute-1 groupadd[127933]: group added to /etc/group: name=dnsmasq, GID=992
Dec 02 23:36:24 compute-1 groupadd[127933]: group added to /etc/gshadow: name=dnsmasq
Dec 02 23:36:24 compute-1 groupadd[127933]: new group: name=dnsmasq, GID=992
Dec 02 23:36:24 compute-1 useradd[127940]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Dec 02 23:36:24 compute-1 dbus-broker-launch[769]: Noticed file-system modification, trigger reload.
Dec 02 23:36:24 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec 02 23:36:24 compute-1 dbus-broker-launch[769]: Noticed file-system modification, trigger reload.
Dec 02 23:36:25 compute-1 groupadd[127953]: group added to /etc/group: name=clevis, GID=991
Dec 02 23:36:25 compute-1 groupadd[127953]: group added to /etc/gshadow: name=clevis
Dec 02 23:36:25 compute-1 groupadd[127953]: new group: name=clevis, GID=991
Dec 02 23:36:25 compute-1 useradd[127960]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Dec 02 23:36:26 compute-1 usermod[127970]: add 'clevis' to group 'tss'
Dec 02 23:36:26 compute-1 usermod[127970]: add 'clevis' to shadow group 'tss'
Dec 02 23:36:28 compute-1 polkitd[43606]: Reloading rules
Dec 02 23:36:28 compute-1 polkitd[43606]: Collecting garbage unconditionally...
Dec 02 23:36:28 compute-1 polkitd[43606]: Loading rules from directory /etc/polkit-1/rules.d
Dec 02 23:36:28 compute-1 polkitd[43606]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 02 23:36:28 compute-1 polkitd[43606]: Finished loading, compiling and executing 3 rules
Dec 02 23:36:28 compute-1 polkitd[43606]: Reloading rules
Dec 02 23:36:28 compute-1 polkitd[43606]: Collecting garbage unconditionally...
Dec 02 23:36:28 compute-1 polkitd[43606]: Loading rules from directory /etc/polkit-1/rules.d
Dec 02 23:36:28 compute-1 polkitd[43606]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 02 23:36:28 compute-1 polkitd[43606]: Finished loading, compiling and executing 3 rules
Dec 02 23:36:29 compute-1 groupadd[128157]: group added to /etc/group: name=ceph, GID=167
Dec 02 23:36:29 compute-1 groupadd[128157]: group added to /etc/gshadow: name=ceph
Dec 02 23:36:29 compute-1 groupadd[128157]: new group: name=ceph, GID=167
Dec 02 23:36:29 compute-1 useradd[128169]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Dec 02 23:36:30 compute-1 podman[128158]: 2025-12-02 23:36:30.044846048 +0000 UTC m=+0.112876755 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4)
Dec 02 23:36:32 compute-1 systemd[1]: Stopping OpenSSH server daemon...
Dec 02 23:36:32 compute-1 systemd[1]: sshd.service: Deactivated successfully.
Dec 02 23:36:32 compute-1 sshd[1008]: Received signal 15; terminating.
Dec 02 23:36:32 compute-1 systemd[1]: Stopped OpenSSH server daemon.
Dec 02 23:36:32 compute-1 systemd[1]: sshd.service: Consumed 1.864s CPU time, read 32.0K from disk, written 4.0K to disk.
Dec 02 23:36:32 compute-1 systemd[1]: Stopped target sshd-keygen.target.
Dec 02 23:36:32 compute-1 systemd[1]: Stopping sshd-keygen.target...
Dec 02 23:36:32 compute-1 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 23:36:32 compute-1 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 23:36:32 compute-1 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 23:36:32 compute-1 systemd[1]: Reached target sshd-keygen.target.
Dec 02 23:36:32 compute-1 systemd[1]: Starting OpenSSH server daemon...
Dec 02 23:36:32 compute-1 sshd[128708]: Server listening on 0.0.0.0 port 22.
Dec 02 23:36:32 compute-1 sshd[128708]: Server listening on :: port 22.
Dec 02 23:36:32 compute-1 systemd[1]: Started OpenSSH server daemon.
Dec 02 23:36:33 compute-1 podman[128768]: 2025-12-02 23:36:33.356475332 +0000 UTC m=+0.060378792 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202)
Dec 02 23:36:35 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 23:36:35 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 02 23:36:35 compute-1 systemd[1]: Reloading.
Dec 02 23:36:35 compute-1 systemd-sysv-generator[128987]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:35 compute-1 systemd-rc-local-generator[128981]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:35 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 23:36:39 compute-1 sudo[110769]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:39 compute-1 sudo[133112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufmalqzehdchjglfrekbsnfzliialzwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718599.3126047-653-127712299578446/AnsiballZ_systemd.py'
Dec 02 23:36:39 compute-1 sudo[133112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:40 compute-1 python3.9[133141]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 23:36:40 compute-1 systemd[1]: Reloading.
Dec 02 23:36:40 compute-1 systemd-rc-local-generator[133505]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:40 compute-1 systemd-sysv-generator[133509]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:40 compute-1 sudo[133112]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:41 compute-1 sudo[134196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwzckocdgmpsmiqbmlidmvhwtlygwoow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718600.7932956-653-116555902781289/AnsiballZ_systemd.py'
Dec 02 23:36:41 compute-1 sudo[134196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:41 compute-1 python3.9[134225]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 23:36:41 compute-1 systemd[1]: Reloading.
Dec 02 23:36:41 compute-1 systemd-sysv-generator[134567]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:41 compute-1 systemd-rc-local-generator[134562]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:41 compute-1 sudo[134196]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:42 compute-1 sudo[135264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onltfjnggnycfirykxkmwmubnaejyakb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718602.0441408-653-206233596741714/AnsiballZ_systemd.py'
Dec 02 23:36:42 compute-1 sudo[135264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:42 compute-1 python3.9[135285]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 23:36:42 compute-1 systemd[1]: Reloading.
Dec 02 23:36:42 compute-1 systemd-rc-local-generator[135594]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:42 compute-1 systemd-sysv-generator[135601]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:43 compute-1 sudo[135264]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:43 compute-1 sudo[136255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfpfkzifxajcljycqgcdmjaifmrmfqca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718603.2370903-653-179298383945793/AnsiballZ_systemd.py'
Dec 02 23:36:43 compute-1 sudo[136255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:43 compute-1 python3.9[136277]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 23:36:44 compute-1 systemd[1]: Reloading.
Dec 02 23:36:44 compute-1 systemd-rc-local-generator[136591]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:44 compute-1 systemd-sysv-generator[136597]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:44 compute-1 sudo[136255]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:44 compute-1 sudo[137347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgweshimsvvvbphyjttiljjmviuvjarc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718604.5734744-711-242622078855071/AnsiballZ_systemd.py'
Dec 02 23:36:44 compute-1 sudo[137347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:45 compute-1 python3.9[137366]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:45 compute-1 systemd[1]: Reloading.
Dec 02 23:36:45 compute-1 systemd-rc-local-generator[137749]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:45 compute-1 systemd-sysv-generator[137755]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:45 compute-1 sudo[137347]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:46 compute-1 sudo[138357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htpniaxvsgfuxyfawwzejyopnxunopde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718605.8902912-711-52902329568562/AnsiballZ_systemd.py'
Dec 02 23:36:46 compute-1 sudo[138357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:46 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 23:36:46 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 02 23:36:46 compute-1 systemd[1]: man-db-cache-update.service: Consumed 13.943s CPU time.
Dec 02 23:36:46 compute-1 systemd[1]: run-r96de6f23720d43e2ba309e9d5d546c19.service: Deactivated successfully.
Dec 02 23:36:46 compute-1 python3.9[138380]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:46 compute-1 systemd[1]: Reloading.
Dec 02 23:36:46 compute-1 systemd-sysv-generator[138503]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:46 compute-1 systemd-rc-local-generator[138499]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:46 compute-1 sudo[138357]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:47 compute-1 sudo[138657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbkfupylhoedbygxeohbzrkxpmnvsoob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718607.0981476-711-232206254415772/AnsiballZ_systemd.py'
Dec 02 23:36:47 compute-1 sudo[138657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:47 compute-1 python3.9[138659]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:47 compute-1 systemd[1]: Reloading.
Dec 02 23:36:47 compute-1 systemd-rc-local-generator[138689]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:47 compute-1 systemd-sysv-generator[138693]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:48 compute-1 sudo[138657]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:48 compute-1 sudo[138847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vouwjmcwxqmcwgszglouysgchqqidylg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718608.2560296-711-99854725876063/AnsiballZ_systemd.py'
Dec 02 23:36:48 compute-1 sudo[138847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:48 compute-1 python3.9[138849]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:49 compute-1 sudo[138847]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:49 compute-1 sudo[139002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfhaslycciqckkawkakxzgcsyuwimfak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718609.2263637-711-72962536510577/AnsiballZ_systemd.py'
Dec 02 23:36:49 compute-1 sudo[139002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:49 compute-1 python3.9[139004]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:50 compute-1 systemd[1]: Reloading.
Dec 02 23:36:50 compute-1 systemd-rc-local-generator[139036]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:50 compute-1 systemd-sysv-generator[139039]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:50 compute-1 sudo[139002]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:50 compute-1 sudo[139193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnrljkymnofdqchkdpfbhxlvvfxvprpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718610.628407-783-19373848890620/AnsiballZ_systemd.py'
Dec 02 23:36:50 compute-1 sudo[139193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:51 compute-1 python3.9[139195]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 23:36:51 compute-1 systemd[1]: Reloading.
Dec 02 23:36:51 compute-1 systemd-sysv-generator[139226]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:51 compute-1 systemd-rc-local-generator[139221]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:51 compute-1 systemd[1]: Listening on libvirt proxy daemon socket.
Dec 02 23:36:51 compute-1 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec 02 23:36:51 compute-1 sudo[139193]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:53 compute-1 sudo[139386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kttldgzokzvoxxcyaymuzjvxvzyrvzlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718612.8988855-799-265825982585705/AnsiballZ_systemd.py'
Dec 02 23:36:53 compute-1 sudo[139386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:53 compute-1 python3.9[139388]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:54 compute-1 sudo[139386]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:55 compute-1 sudo[139543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltlawcqxwthfwuuacmllabdirsecdxnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718614.8458772-799-192630410613157/AnsiballZ_systemd.py'
Dec 02 23:36:55 compute-1 sudo[139543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:55 compute-1 sshd-session[139390]: Received disconnect from 80.94.93.233 port 32792:11:  [preauth]
Dec 02 23:36:55 compute-1 sshd-session[139390]: Disconnected from authenticating user root 80.94.93.233 port 32792 [preauth]
Dec 02 23:36:55 compute-1 python3.9[139545]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:55 compute-1 sudo[139543]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:56 compute-1 sudo[139698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpsbbjwmhbqqydipmqhwsrkxzkwwvqcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718615.8484952-799-157291355748995/AnsiballZ_systemd.py'
Dec 02 23:36:56 compute-1 sudo[139698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:56 compute-1 python3.9[139700]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:56 compute-1 sudo[139698]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:57 compute-1 sudo[139853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srjfpntmnoezeqwllltikitwlizjswuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718616.7335906-799-39960598064743/AnsiballZ_systemd.py'
Dec 02 23:36:57 compute-1 sudo[139853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:57 compute-1 python3.9[139855]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:57 compute-1 sudo[139853]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:58 compute-1 sudo[140008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bagmpphnklknkwffvbecbhzopzvpoqvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718617.6595888-799-54299297902985/AnsiballZ_systemd.py'
Dec 02 23:36:58 compute-1 sudo[140008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:58 compute-1 python3.9[140010]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:58 compute-1 sudo[140008]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:58 compute-1 sudo[140163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikhmnwtnqewtcdqftvxocjszxfnacooc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718618.5953672-799-278655665134778/AnsiballZ_systemd.py'
Dec 02 23:36:58 compute-1 sudo[140163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:59 compute-1 python3.9[140165]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:59 compute-1 sudo[140163]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:59 compute-1 sudo[140318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pipxzockiryhgvecplawuomlhbnhhzju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718619.5549538-799-17988807826881/AnsiballZ_systemd.py'
Dec 02 23:36:59 compute-1 sudo[140318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:00 compute-1 python3.9[140320]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:37:00 compute-1 podman[140321]: 2025-12-02 23:37:00.320882419 +0000 UTC m=+0.148358294 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20251202)
Dec 02 23:37:00 compute-1 sudo[140318]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:01 compute-1 sudo[140499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syjcuqrbczyjqcwvwocigkxnmqlglncg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718620.617503-799-138156133199760/AnsiballZ_systemd.py'
Dec 02 23:37:01 compute-1 sudo[140499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:01 compute-1 python3.9[140501]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:37:01 compute-1 sudo[140499]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:37:01.657 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:37:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:37:01.658 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:37:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:37:01.658 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:37:02 compute-1 sudo[140655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqqdcoaouwqxlxdbceqgnnvvbchrkobg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718621.614282-799-156043443716224/AnsiballZ_systemd.py'
Dec 02 23:37:02 compute-1 sudo[140655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:02 compute-1 python3.9[140657]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:37:02 compute-1 sudo[140655]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:02 compute-1 sudo[140810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpfzzhjxjuctqekkyivkleumecgdwbfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718622.6257892-799-129373917246322/AnsiballZ_systemd.py'
Dec 02 23:37:02 compute-1 sudo[140810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:03 compute-1 python3.9[140812]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:37:03 compute-1 sudo[140810]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:03 compute-1 podman[140816]: 2025-12-02 23:37:03.511849074 +0000 UTC m=+0.093338695 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:37:04 compute-1 sudo[140984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpioazrispkmmzifpesxmnocjadurjno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718623.6653495-799-141382980354869/AnsiballZ_systemd.py'
Dec 02 23:37:04 compute-1 sudo[140984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:04 compute-1 python3.9[140986]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:37:05 compute-1 sudo[140984]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:06 compute-1 sudo[141139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-metznubjwljvtmljhwfvjpllhpgiqifq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718625.756204-799-167920633751282/AnsiballZ_systemd.py'
Dec 02 23:37:06 compute-1 sudo[141139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:06 compute-1 python3.9[141141]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:37:06 compute-1 sudo[141139]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:07 compute-1 sudo[141294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aotwoigsesnmtwvrahhakasaegioifsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718626.7869258-799-103849312600868/AnsiballZ_systemd.py'
Dec 02 23:37:07 compute-1 sudo[141294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:07 compute-1 python3.9[141296]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:37:07 compute-1 sudo[141294]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:08 compute-1 sudo[141449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spbfhbdfwooqjwcvoqntizhhjuilqchk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718627.7130513-799-116184148397983/AnsiballZ_systemd.py'
Dec 02 23:37:08 compute-1 sudo[141449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:08 compute-1 python3.9[141451]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:37:08 compute-1 sudo[141449]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:09 compute-1 sudo[141604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiporjkjjxkwjbccirkkbpnvhuuksmew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718629.5175087-1003-159327450140387/AnsiballZ_file.py'
Dec 02 23:37:09 compute-1 sudo[141604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:10 compute-1 python3.9[141606]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:37:10 compute-1 sudo[141604]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:10 compute-1 sudo[141756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slrutftebneztxrdiqivbxworoyzbjcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718630.2200837-1003-119617853989041/AnsiballZ_file.py'
Dec 02 23:37:10 compute-1 sudo[141756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:10 compute-1 python3.9[141758]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:37:10 compute-1 sudo[141756]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:11 compute-1 sudo[141908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipwqudroigrgkfxjemnirxdyhhdhishi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718630.8543007-1003-163095978026090/AnsiballZ_file.py'
Dec 02 23:37:11 compute-1 sudo[141908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:11 compute-1 python3.9[141910]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:37:11 compute-1 sudo[141908]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:11 compute-1 sudo[142060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygbivlkhbfczuvnaxrinomlgjzikwztn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718631.4780176-1003-154482250111446/AnsiballZ_file.py'
Dec 02 23:37:11 compute-1 sudo[142060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:11 compute-1 python3.9[142062]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:37:12 compute-1 sudo[142060]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:12 compute-1 sudo[142212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pylvxhspgnphcjcwkwjmzoxboaolrape ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718632.168909-1003-1202386293576/AnsiballZ_file.py'
Dec 02 23:37:12 compute-1 sudo[142212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:12 compute-1 python3.9[142214]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:37:12 compute-1 sudo[142212]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:13 compute-1 sudo[142364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbyhqmlkwzvifuuqhmfrrkntfuutlyhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718632.891777-1003-195956318590337/AnsiballZ_file.py'
Dec 02 23:37:13 compute-1 sudo[142364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:13 compute-1 python3.9[142366]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:37:13 compute-1 sudo[142364]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:14 compute-1 sudo[142516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rewiuavtcsdzbjgmbpbapuwghdxmehfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718633.6391478-1089-159545972143247/AnsiballZ_stat.py'
Dec 02 23:37:14 compute-1 sudo[142516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:14 compute-1 python3.9[142518]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:14 compute-1 sudo[142516]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:14 compute-1 sudo[142641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfftscwqzekealvlnqlbcswjivdzorim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718633.6391478-1089-159545972143247/AnsiballZ_copy.py'
Dec 02 23:37:14 compute-1 sudo[142641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:15 compute-1 python3.9[142643]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764718633.6391478-1089-159545972143247/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:15 compute-1 sudo[142641]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:15 compute-1 sudo[142793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdysfqterkhdnkftswkbmftkqabqfpgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718635.274066-1089-102893854316935/AnsiballZ_stat.py'
Dec 02 23:37:15 compute-1 sudo[142793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:15 compute-1 python3.9[142795]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:15 compute-1 sudo[142793]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:16 compute-1 sudo[142918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvwblqvdopiilsayayxfzhjszfltkzpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718635.274066-1089-102893854316935/AnsiballZ_copy.py'
Dec 02 23:37:16 compute-1 sudo[142918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:16 compute-1 python3.9[142920]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764718635.274066-1089-102893854316935/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:16 compute-1 sudo[142918]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:17 compute-1 sudo[143070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqjnmigmorgopsuvzdheblnnfxwknmjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718636.833402-1089-76579249090119/AnsiballZ_stat.py'
Dec 02 23:37:17 compute-1 sudo[143070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:17 compute-1 python3.9[143072]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:17 compute-1 sudo[143070]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:17 compute-1 sudo[143195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uselbtkurnkhqknerfwkhzvspzyquhmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718636.833402-1089-76579249090119/AnsiballZ_copy.py'
Dec 02 23:37:17 compute-1 sudo[143195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:17 compute-1 python3.9[143197]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764718636.833402-1089-76579249090119/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:18 compute-1 sudo[143195]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:18 compute-1 sudo[143347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpxvkhrdwhdxldfcrtemefusypafrqru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718638.1944978-1089-213781380634056/AnsiballZ_stat.py'
Dec 02 23:37:18 compute-1 sudo[143347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:18 compute-1 python3.9[143349]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:18 compute-1 sudo[143347]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:19 compute-1 sudo[143472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krxzgqhqdjmaonavkuwrrmuhlgnceshg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718638.1944978-1089-213781380634056/AnsiballZ_copy.py'
Dec 02 23:37:19 compute-1 sudo[143472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:19 compute-1 python3.9[143474]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764718638.1944978-1089-213781380634056/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:19 compute-1 sudo[143472]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:20 compute-1 sudo[143624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aiwajxmkuvjiuvpwbbcxinyxfvpilzom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718639.7078214-1089-8581252898319/AnsiballZ_stat.py'
Dec 02 23:37:20 compute-1 sudo[143624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:20 compute-1 python3.9[143626]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:20 compute-1 sudo[143624]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:20 compute-1 sudo[143749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bajdbnsffxwimqjvuwwhmcazsmlfdxol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718639.7078214-1089-8581252898319/AnsiballZ_copy.py'
Dec 02 23:37:20 compute-1 sudo[143749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:20 compute-1 python3.9[143751]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764718639.7078214-1089-8581252898319/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:20 compute-1 sudo[143749]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:21 compute-1 sudo[143901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rljmkxqsqmzfdbbkhjuzpilekpppgfew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718641.1494987-1089-97801188251270/AnsiballZ_stat.py'
Dec 02 23:37:21 compute-1 sudo[143901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:21 compute-1 python3.9[143903]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:21 compute-1 sudo[143901]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:22 compute-1 sudo[144026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixnwvwicuuvtrgbrspndpjfuspezeoxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718641.1494987-1089-97801188251270/AnsiballZ_copy.py'
Dec 02 23:37:22 compute-1 sudo[144026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:22 compute-1 python3.9[144028]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764718641.1494987-1089-97801188251270/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:22 compute-1 sudo[144026]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:22 compute-1 sudo[144178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqhplmrnrkpzienwsdhubfctplqxvirc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718642.680514-1089-120469302016336/AnsiballZ_stat.py'
Dec 02 23:37:22 compute-1 sudo[144178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:23 compute-1 python3.9[144180]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:23 compute-1 sudo[144178]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:23 compute-1 sudo[144301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqejdnyhzvkfjteehuvtghaikwvbkyaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718642.680514-1089-120469302016336/AnsiballZ_copy.py'
Dec 02 23:37:23 compute-1 sudo[144301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:23 compute-1 python3.9[144303]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764718642.680514-1089-120469302016336/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:23 compute-1 sudo[144301]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:24 compute-1 sudo[144453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmwfqnfsabezsoykaavonotbfpkfjbae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718643.9133763-1089-199846313781351/AnsiballZ_stat.py'
Dec 02 23:37:24 compute-1 sudo[144453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:24 compute-1 python3.9[144455]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:24 compute-1 sudo[144453]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:25 compute-1 sudo[144578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sldpzokeeswpgbtrvahfqxcbvahorxbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718643.9133763-1089-199846313781351/AnsiballZ_copy.py'
Dec 02 23:37:25 compute-1 sudo[144578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:25 compute-1 python3.9[144580]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764718643.9133763-1089-199846313781351/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:25 compute-1 sudo[144578]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:25 compute-1 sudo[144730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjcqcokzvzsqgyvqslnhumfqysiautku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718645.53398-1315-125601153986728/AnsiballZ_command.py'
Dec 02 23:37:25 compute-1 sudo[144730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:26 compute-1 python3.9[144732]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec 02 23:37:26 compute-1 sudo[144730]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:26 compute-1 sudo[144883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkkpulvvcgfamlrrjiigywlvtqyyhziq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718646.3781562-1333-159514673994179/AnsiballZ_file.py'
Dec 02 23:37:26 compute-1 sudo[144883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:26 compute-1 python3.9[144885]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:26 compute-1 sudo[144883]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:27 compute-1 sudo[145035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zydvwgjhpgpsbtwwnhopmmqdyedwkamf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718647.132402-1333-123420064726582/AnsiballZ_file.py'
Dec 02 23:37:27 compute-1 sudo[145035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:27 compute-1 python3.9[145037]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:27 compute-1 sudo[145035]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:28 compute-1 sudo[145187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdtlxttekqhmcejnjzmakbmzjujxauvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718647.9053118-1333-275823279332509/AnsiballZ_file.py'
Dec 02 23:37:28 compute-1 sudo[145187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:28 compute-1 python3.9[145189]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:28 compute-1 sudo[145187]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:29 compute-1 sudo[145339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clmtrbawddwotumipefxbzxgslfbbvcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718648.6663501-1333-24453018589709/AnsiballZ_file.py'
Dec 02 23:37:29 compute-1 sudo[145339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:29 compute-1 python3.9[145341]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:29 compute-1 sudo[145339]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:29 compute-1 sudo[145491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgjthhsrnwpljquxubbxgzeonfzpcpwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718649.3993702-1333-21236073198321/AnsiballZ_file.py'
Dec 02 23:37:29 compute-1 sudo[145491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:29 compute-1 python3.9[145493]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:30 compute-1 sudo[145491]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:30 compute-1 sudo[145656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykzpdineybbusinunfpiaiqmtwxqlcpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718650.188816-1333-87152805132294/AnsiballZ_file.py'
Dec 02 23:37:30 compute-1 sudo[145656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:30 compute-1 podman[145617]: 2025-12-02 23:37:30.575682442 +0000 UTC m=+0.129089008 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Dec 02 23:37:30 compute-1 python3.9[145665]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:30 compute-1 sudo[145656]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:31 compute-1 sudo[145822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stbdyvlnjewjbgayzcbvnncrqbztbwdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718650.8965397-1333-229723728093867/AnsiballZ_file.py'
Dec 02 23:37:31 compute-1 sudo[145822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:31 compute-1 python3.9[145824]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:31 compute-1 sudo[145822]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:31 compute-1 sudo[145974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqqftoxrxvuoslgiwjhalfoazaeecaks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718651.5267851-1333-158916982280118/AnsiballZ_file.py'
Dec 02 23:37:31 compute-1 sudo[145974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:32 compute-1 python3.9[145976]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:32 compute-1 sudo[145974]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:32 compute-1 sudo[146126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uczgaxyvmomcnkhdievwsjuuygxaeyjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718652.2783363-1333-149239111529426/AnsiballZ_file.py'
Dec 02 23:37:32 compute-1 sudo[146126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:32 compute-1 python3.9[146128]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:32 compute-1 sudo[146126]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:33 compute-1 sudo[146278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxoqwobtkpddycphyqvabratgmiekift ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718652.9578056-1333-114194306128758/AnsiballZ_file.py'
Dec 02 23:37:33 compute-1 sudo[146278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:33 compute-1 python3.9[146280]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:33 compute-1 sudo[146278]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:34 compute-1 podman[146404]: 2025-12-02 23:37:34.195314441 +0000 UTC m=+0.071291473 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 02 23:37:34 compute-1 sudo[146444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msnrlwvrtbxmabwuizeidbossyuumxvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718653.7651422-1333-255715036101989/AnsiballZ_file.py'
Dec 02 23:37:34 compute-1 sudo[146444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:34 compute-1 python3.9[146450]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:34 compute-1 sudo[146444]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:35 compute-1 sudo[146600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yupjlnarjtnyzekfedrpbmhjvypfjgks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718654.6173134-1333-251016960218298/AnsiballZ_file.py'
Dec 02 23:37:35 compute-1 sudo[146600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:35 compute-1 python3.9[146602]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:35 compute-1 sudo[146600]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:35 compute-1 sudo[146752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiqhmabnjcnkjsseqtkxqdleyraexmwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718655.3844035-1333-57693653809649/AnsiballZ_file.py'
Dec 02 23:37:35 compute-1 sudo[146752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:35 compute-1 python3.9[146754]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:35 compute-1 sudo[146752]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:36 compute-1 sudo[146904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjuwqyidgfbcpujktkiwtrsrbobqaauz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718656.042294-1333-248217980708532/AnsiballZ_file.py'
Dec 02 23:37:36 compute-1 sudo[146904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:36 compute-1 python3.9[146906]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:36 compute-1 sudo[146904]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:37 compute-1 sudo[147056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cijtrrexplrzsjecpuhbuillxmpugzdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718656.8228045-1531-99454996751943/AnsiballZ_stat.py'
Dec 02 23:37:37 compute-1 sudo[147056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:37 compute-1 python3.9[147058]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:37 compute-1 sudo[147056]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:37 compute-1 sudo[147179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckoysolyuuegnvatfuhdwavgbqwcxidn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718656.8228045-1531-99454996751943/AnsiballZ_copy.py'
Dec 02 23:37:37 compute-1 sudo[147179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:37 compute-1 python3.9[147181]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718656.8228045-1531-99454996751943/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:37 compute-1 sudo[147179]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:38 compute-1 sudo[147331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnsgbhenrxyynfddoyjeunrnnlxxyohd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718658.06715-1531-117247552470705/AnsiballZ_stat.py'
Dec 02 23:37:38 compute-1 sudo[147331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:38 compute-1 python3.9[147333]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:38 compute-1 sudo[147331]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:39 compute-1 sudo[147454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvqsjrnoiyztywpapfcqzeuiktwchvrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718658.06715-1531-117247552470705/AnsiballZ_copy.py'
Dec 02 23:37:39 compute-1 sudo[147454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:39 compute-1 python3.9[147456]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718658.06715-1531-117247552470705/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:39 compute-1 sudo[147454]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:39 compute-1 sudo[147606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cojsfptffklkmxovqvxrxvgyjwugcwmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718659.4378285-1531-275777222769181/AnsiballZ_stat.py'
Dec 02 23:37:39 compute-1 sudo[147606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:39 compute-1 python3.9[147608]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:39 compute-1 sudo[147606]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:40 compute-1 sudo[147729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usbneihnhthijscvcjybeoibhlailarb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718659.4378285-1531-275777222769181/AnsiballZ_copy.py'
Dec 02 23:37:40 compute-1 sudo[147729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:40 compute-1 python3.9[147731]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718659.4378285-1531-275777222769181/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:40 compute-1 sudo[147729]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:41 compute-1 sudo[147881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnnvxzeeazzztikkyhqunqljitvwliwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718660.805883-1531-32258858065749/AnsiballZ_stat.py'
Dec 02 23:37:41 compute-1 sudo[147881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:41 compute-1 python3.9[147883]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:41 compute-1 sudo[147881]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:41 compute-1 sudo[148004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpctvvcdgzekqslllifayucxbqqcwfty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718660.805883-1531-32258858065749/AnsiballZ_copy.py'
Dec 02 23:37:41 compute-1 sudo[148004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:42 compute-1 python3.9[148006]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718660.805883-1531-32258858065749/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:42 compute-1 sudo[148004]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:42 compute-1 sudo[148156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqdeuevvtycbtoyundyxtdzuacwqnpcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718662.303722-1531-151304469087269/AnsiballZ_stat.py'
Dec 02 23:37:42 compute-1 sudo[148156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:42 compute-1 python3.9[148158]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:42 compute-1 sudo[148156]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:43 compute-1 sudo[148279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isfspopxmearuwvwitmxfiohizhjgsnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718662.303722-1531-151304469087269/AnsiballZ_copy.py'
Dec 02 23:37:43 compute-1 sudo[148279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:43 compute-1 python3.9[148281]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718662.303722-1531-151304469087269/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:43 compute-1 sudo[148279]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:44 compute-1 sudo[148431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcxuafobgmkzngeyiziephltbcdxrphu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718663.750589-1531-279366736715638/AnsiballZ_stat.py'
Dec 02 23:37:44 compute-1 sudo[148431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:44 compute-1 python3.9[148433]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:44 compute-1 sudo[148431]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:44 compute-1 sudo[148554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jriyilevahesllwrowufnutgcwnpmowb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718663.750589-1531-279366736715638/AnsiballZ_copy.py'
Dec 02 23:37:44 compute-1 sudo[148554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:44 compute-1 python3.9[148556]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718663.750589-1531-279366736715638/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:44 compute-1 sudo[148554]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:45 compute-1 sudo[148706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waduykbnogfvacrrvettmkhclhdjhkgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718665.1424668-1531-270080189201714/AnsiballZ_stat.py'
Dec 02 23:37:45 compute-1 sudo[148706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:45 compute-1 python3.9[148708]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:45 compute-1 sudo[148706]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:45 compute-1 sudo[148829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voiwqmhcjshxgpxbwamjhlezuvcctmzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718665.1424668-1531-270080189201714/AnsiballZ_copy.py'
Dec 02 23:37:45 compute-1 sudo[148829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:46 compute-1 python3.9[148831]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718665.1424668-1531-270080189201714/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:46 compute-1 sudo[148829]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:46 compute-1 sudo[148981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kobnerffeomglqjgfkpeffbxwkieeodu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718666.2827232-1531-214227715907767/AnsiballZ_stat.py'
Dec 02 23:37:46 compute-1 sudo[148981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:46 compute-1 python3.9[148983]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:46 compute-1 sudo[148981]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:47 compute-1 sudo[149104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umquyoxcqqwyoduhpwsotctsewidrrel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718666.2827232-1531-214227715907767/AnsiballZ_copy.py'
Dec 02 23:37:47 compute-1 sudo[149104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:47 compute-1 python3.9[149106]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718666.2827232-1531-214227715907767/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:47 compute-1 sudo[149104]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:47 compute-1 sudo[149256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uefxxqeppppuizccebjccvgsmpenbqov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718667.441747-1531-2021617579902/AnsiballZ_stat.py'
Dec 02 23:37:47 compute-1 sudo[149256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:47 compute-1 python3.9[149258]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:47 compute-1 sudo[149256]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:48 compute-1 sudo[149379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cglzbeyzkqughclbtludteyswtcczxrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718667.441747-1531-2021617579902/AnsiballZ_copy.py'
Dec 02 23:37:48 compute-1 sudo[149379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:48 compute-1 python3.9[149381]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718667.441747-1531-2021617579902/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:48 compute-1 sudo[149379]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:49 compute-1 sudo[149531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wafoqofzxkyhomazbukhnqxnvttshnfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718668.7790775-1531-14223436962222/AnsiballZ_stat.py'
Dec 02 23:37:49 compute-1 sudo[149531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:49 compute-1 python3.9[149533]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:49 compute-1 sudo[149531]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:49 compute-1 sudo[149654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpesmwoxjletzbidbeliqzhbrwgiqucc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718668.7790775-1531-14223436962222/AnsiballZ_copy.py'
Dec 02 23:37:49 compute-1 sudo[149654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:50 compute-1 python3.9[149656]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718668.7790775-1531-14223436962222/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:50 compute-1 sudo[149654]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:50 compute-1 sudo[149806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cthmlfhrnjanumkqgqmxcqocrutdidxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718670.266104-1531-126042741508932/AnsiballZ_stat.py'
Dec 02 23:37:50 compute-1 sudo[149806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:50 compute-1 python3.9[149808]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:50 compute-1 sudo[149806]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:51 compute-1 sudo[149929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiqzwnhoqfgepdmhtmgxcdnyauawfgli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718670.266104-1531-126042741508932/AnsiballZ_copy.py'
Dec 02 23:37:51 compute-1 sudo[149929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:51 compute-1 python3.9[149931]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718670.266104-1531-126042741508932/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:51 compute-1 sudo[149929]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:51 compute-1 sudo[150081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyxnfwfwcveyliwuqorofnqmmkbcwufp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718671.518071-1531-266853286759066/AnsiballZ_stat.py'
Dec 02 23:37:51 compute-1 sudo[150081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:52 compute-1 python3.9[150083]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:52 compute-1 sudo[150081]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:52 compute-1 sudo[150204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqaydxtudxyygffeofpxnhuidckprlhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718671.518071-1531-266853286759066/AnsiballZ_copy.py'
Dec 02 23:37:52 compute-1 sudo[150204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:52 compute-1 python3.9[150206]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718671.518071-1531-266853286759066/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:52 compute-1 sudo[150204]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:52 compute-1 sudo[150356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmdkglblgjurhsculmxupdqlinaqgvtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718672.7004936-1531-250688193165860/AnsiballZ_stat.py'
Dec 02 23:37:52 compute-1 sudo[150356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:53 compute-1 python3.9[150358]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:53 compute-1 sudo[150356]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:53 compute-1 sudo[150479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otmcxjouddmqblpqehtzvwbsokuyfvyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718672.7004936-1531-250688193165860/AnsiballZ_copy.py'
Dec 02 23:37:53 compute-1 sudo[150479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:53 compute-1 python3.9[150481]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718672.7004936-1531-250688193165860/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:53 compute-1 sudo[150479]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:54 compute-1 sudo[150631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekwpxbspodgahsuhqildzayyaekdbeva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718673.9660487-1531-227771922209962/AnsiballZ_stat.py'
Dec 02 23:37:54 compute-1 sudo[150631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:54 compute-1 python3.9[150633]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:54 compute-1 sudo[150631]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:54 compute-1 sudo[150754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zykyzhrbqcmyakhrsjbyrbkgcpcszkkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718673.9660487-1531-227771922209962/AnsiballZ_copy.py'
Dec 02 23:37:54 compute-1 sudo[150754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:55 compute-1 python3.9[150756]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718673.9660487-1531-227771922209962/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:55 compute-1 sudo[150754]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:56 compute-1 python3.9[150906]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:37:56 compute-1 sudo[151059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grmaeomaavnezkfjvimthyobmjmkeizd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718676.4203238-1943-191477716181724/AnsiballZ_seboolean.py'
Dec 02 23:37:56 compute-1 sudo[151059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:57 compute-1 python3.9[151061]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 02 23:37:58 compute-1 sudo[151059]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:59 compute-1 sudo[151215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-endqegsaqluzzawzfvassudvrlimuggs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718678.7090743-1959-264949108624522/AnsiballZ_copy.py'
Dec 02 23:37:59 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec 02 23:37:59 compute-1 sudo[151215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:59 compute-1 python3.9[151217]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:59 compute-1 sudo[151215]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:59 compute-1 sudo[151367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twrprogumzmxxuwrgullzwbaqjmcdoag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718679.4550054-1959-181252061338939/AnsiballZ_copy.py'
Dec 02 23:37:59 compute-1 sudo[151367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:00 compute-1 python3.9[151369]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:00 compute-1 sudo[151367]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:00 compute-1 sudo[151519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lihnavwudwhyowmwvdzefthccpyionuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718680.2573905-1959-216366056532087/AnsiballZ_copy.py'
Dec 02 23:38:00 compute-1 sudo[151519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:00 compute-1 podman[151521]: 2025-12-02 23:38:00.808417399 +0000 UTC m=+0.134064568 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Dec 02 23:38:00 compute-1 python3.9[151522]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:00 compute-1 sudo[151519]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:01 compute-1 sudo[151697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sywbdellhunbgetlrayttvqpcpvyezij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718681.0981913-1959-108292649960309/AnsiballZ_copy.py'
Dec 02 23:38:01 compute-1 sudo[151697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:38:01.660 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:38:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:38:01.660 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:38:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:38:01.660 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:38:01 compute-1 python3.9[151699]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:01 compute-1 sudo[151697]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:02 compute-1 sudo[151850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gareiakwkskorfffaswvbvjagbutsoau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718681.9660356-1959-213259730566865/AnsiballZ_copy.py'
Dec 02 23:38:02 compute-1 sudo[151850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:02 compute-1 python3.9[151852]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:02 compute-1 sudo[151850]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:03 compute-1 sudo[152002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmvwbhyimgaotxniixaideypfnmhakaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718682.795229-2031-260233863427880/AnsiballZ_copy.py'
Dec 02 23:38:03 compute-1 sudo[152002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:03 compute-1 python3.9[152004]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:03 compute-1 sudo[152002]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:04 compute-1 sudo[152154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsupyhlwzdfnobnflknnarwkoeifdfrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718683.6339238-2031-97398227753741/AnsiballZ_copy.py'
Dec 02 23:38:04 compute-1 sudo[152154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:04 compute-1 python3.9[152156]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:04 compute-1 sudo[152154]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:04 compute-1 podman[152157]: 2025-12-02 23:38:04.324923914 +0000 UTC m=+0.081876219 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent)
Dec 02 23:38:04 compute-1 sudo[152325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fixmaujisvonsaknrbcwrygcoqxqcoiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718684.4404774-2031-161093096116333/AnsiballZ_copy.py'
Dec 02 23:38:04 compute-1 sudo[152325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:04 compute-1 python3.9[152327]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:04 compute-1 sudo[152325]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:05 compute-1 sudo[152477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alnxlsqqmxyaebjhieqabufxagrydpug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718685.1438825-2031-266780645033935/AnsiballZ_copy.py'
Dec 02 23:38:05 compute-1 sudo[152477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:05 compute-1 python3.9[152479]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:05 compute-1 sudo[152477]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:06 compute-1 sudo[152629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvjdlzeyyffzhkghkmxgonunharjrccr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718685.9385436-2031-41117779786215/AnsiballZ_copy.py'
Dec 02 23:38:06 compute-1 sudo[152629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:06 compute-1 python3.9[152631]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:06 compute-1 sudo[152629]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:07 compute-1 sudo[152781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-synwdhbrntxmoqqitqvxmfkdpjotaszp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718686.854051-2103-277805939239182/AnsiballZ_systemd.py'
Dec 02 23:38:07 compute-1 sudo[152781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:07 compute-1 python3.9[152783]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:38:07 compute-1 systemd[1]: Reloading.
Dec 02 23:38:07 compute-1 systemd-sysv-generator[152816]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:38:07 compute-1 systemd-rc-local-generator[152812]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:38:07 compute-1 systemd[1]: Starting libvirt logging daemon socket...
Dec 02 23:38:07 compute-1 systemd[1]: Listening on libvirt logging daemon socket.
Dec 02 23:38:07 compute-1 systemd[1]: Starting libvirt logging daemon admin socket...
Dec 02 23:38:07 compute-1 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 02 23:38:07 compute-1 systemd[1]: Starting libvirt logging daemon...
Dec 02 23:38:07 compute-1 systemd[1]: Started libvirt logging daemon.
Dec 02 23:38:08 compute-1 sudo[152781]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:08 compute-1 sudo[152975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzakyzbqmxjilcvfycuwbkbopkndmdnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718688.2184095-2103-207557396874555/AnsiballZ_systemd.py'
Dec 02 23:38:08 compute-1 sudo[152975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:08 compute-1 python3.9[152977]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:38:08 compute-1 systemd[1]: Reloading.
Dec 02 23:38:08 compute-1 systemd-sysv-generator[153008]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:38:08 compute-1 systemd-rc-local-generator[153002]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:38:09 compute-1 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 02 23:38:09 compute-1 systemd[1]: Starting libvirt nodedev daemon socket...
Dec 02 23:38:09 compute-1 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 02 23:38:09 compute-1 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 02 23:38:09 compute-1 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 02 23:38:09 compute-1 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 02 23:38:09 compute-1 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 02 23:38:09 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Dec 02 23:38:09 compute-1 systemd[1]: Started libvirt nodedev daemon.
Dec 02 23:38:09 compute-1 sudo[152975]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:09 compute-1 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 02 23:38:09 compute-1 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec 02 23:38:09 compute-1 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec 02 23:38:09 compute-1 sudo[153198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuuxmhfyzzgtnwghxcnftgenzhxnhjoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718689.52249-2103-167341937200290/AnsiballZ_systemd.py'
Dec 02 23:38:09 compute-1 sudo[153198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:10 compute-1 python3.9[153200]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:38:10 compute-1 systemd[1]: Reloading.
Dec 02 23:38:10 compute-1 systemd-rc-local-generator[153226]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:38:10 compute-1 systemd-sysv-generator[153230]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:38:10 compute-1 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 02 23:38:10 compute-1 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 02 23:38:10 compute-1 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 02 23:38:10 compute-1 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 02 23:38:10 compute-1 systemd[1]: Starting libvirt proxy daemon...
Dec 02 23:38:10 compute-1 systemd[1]: Started libvirt proxy daemon.
Dec 02 23:38:10 compute-1 sudo[153198]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:10 compute-1 setroubleshoot[153013]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l ec863239-e948-497a-82ef-6baa8a71c5df
Dec 02 23:38:10 compute-1 setroubleshoot[153013]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Dec 02 23:38:11 compute-1 sudo[153412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmnlolxdqqhqxksvplwtyjnqzgkkduws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718690.988911-2103-229496595545083/AnsiballZ_systemd.py'
Dec 02 23:38:11 compute-1 sudo[153412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:11 compute-1 python3.9[153414]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:38:11 compute-1 systemd[1]: Reloading.
Dec 02 23:38:11 compute-1 systemd-rc-local-generator[153435]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:38:11 compute-1 systemd-sysv-generator[153442]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:38:12 compute-1 systemd[1]: Listening on libvirt locking daemon socket.
Dec 02 23:38:12 compute-1 systemd[1]: Starting libvirt QEMU daemon socket...
Dec 02 23:38:12 compute-1 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 02 23:38:12 compute-1 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec 02 23:38:12 compute-1 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 02 23:38:12 compute-1 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 02 23:38:12 compute-1 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 02 23:38:12 compute-1 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 02 23:38:12 compute-1 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec 02 23:38:12 compute-1 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 02 23:38:12 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Dec 02 23:38:12 compute-1 systemd[1]: Started libvirt QEMU daemon.
Dec 02 23:38:12 compute-1 sudo[153412]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:12 compute-1 sudo[153627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqjoylqtrvjcwkqafldyauwruypwnxay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718692.4091308-2103-120012688634657/AnsiballZ_systemd.py'
Dec 02 23:38:12 compute-1 sudo[153627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:13 compute-1 python3.9[153629]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:38:13 compute-1 systemd[1]: Reloading.
Dec 02 23:38:13 compute-1 systemd-rc-local-generator[153655]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:38:13 compute-1 systemd-sysv-generator[153660]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:38:13 compute-1 systemd[1]: Starting libvirt secret daemon socket...
Dec 02 23:38:13 compute-1 systemd[1]: Listening on libvirt secret daemon socket.
Dec 02 23:38:13 compute-1 systemd[1]: Starting libvirt secret daemon admin socket...
Dec 02 23:38:13 compute-1 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 02 23:38:13 compute-1 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 02 23:38:13 compute-1 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 02 23:38:13 compute-1 systemd[1]: Starting libvirt secret daemon...
Dec 02 23:38:13 compute-1 systemd[1]: Started libvirt secret daemon.
Dec 02 23:38:13 compute-1 sudo[153627]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:14 compute-1 sudo[153838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiyrdljmrajtlfnezlkmfvdpniumkbmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718693.987354-2177-250223019878280/AnsiballZ_file.py'
Dec 02 23:38:14 compute-1 sudo[153838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:14 compute-1 python3.9[153840]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:14 compute-1 sudo[153838]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:15 compute-1 sudo[153990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyhbjvuviyultmgbqvgqztbxyqbwbrov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718694.8847437-2193-94078502620198/AnsiballZ_find.py'
Dec 02 23:38:15 compute-1 sudo[153990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:15 compute-1 python3.9[153992]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 23:38:15 compute-1 sudo[153990]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:16 compute-1 sudo[154142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxijxxprwbogpjkcuisgxnxnjutstmyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718695.9915667-2221-37568293577358/AnsiballZ_stat.py'
Dec 02 23:38:16 compute-1 sudo[154142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:16 compute-1 python3.9[154144]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:16 compute-1 sudo[154142]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:17 compute-1 sudo[154265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emlqhyihrzufwoflxcklynuzkgyoxvqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718695.9915667-2221-37568293577358/AnsiballZ_copy.py'
Dec 02 23:38:17 compute-1 sudo[154265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:17 compute-1 python3.9[154267]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718695.9915667-2221-37568293577358/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:17 compute-1 sudo[154265]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:17 compute-1 sudo[154417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijuxluogxwiuepwusyrxleiuttbcrsfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718697.6043825-2253-173977023492587/AnsiballZ_file.py'
Dec 02 23:38:17 compute-1 sudo[154417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:18 compute-1 python3.9[154419]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:18 compute-1 sudo[154417]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:18 compute-1 sudo[154569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkrvtfebzlrlwyvfyiexkcnblafxypgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718698.5089467-2269-265132909361307/AnsiballZ_stat.py'
Dec 02 23:38:18 compute-1 sudo[154569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:19 compute-1 python3.9[154571]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:19 compute-1 sudo[154569]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:19 compute-1 sudo[154647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sirwsorhbstzjdcwawmewhpjophxbabz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718698.5089467-2269-265132909361307/AnsiballZ_file.py'
Dec 02 23:38:19 compute-1 sudo[154647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:19 compute-1 python3.9[154649]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:19 compute-1 sudo[154647]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:20 compute-1 sudo[154799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmmteupeqcltptbisaqlyvzjcwxmludm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718700.003798-2293-180693875527895/AnsiballZ_stat.py'
Dec 02 23:38:20 compute-1 sudo[154799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:20 compute-1 python3.9[154801]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:20 compute-1 sudo[154799]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:20 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec 02 23:38:20 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.133s CPU time.
Dec 02 23:38:20 compute-1 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 02 23:38:20 compute-1 sudo[154877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niodeuhkrzckwguhrxcugbkzucekodzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718700.003798-2293-180693875527895/AnsiballZ_file.py'
Dec 02 23:38:20 compute-1 sudo[154877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:21 compute-1 python3.9[154879]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.tl33urv7 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:21 compute-1 sudo[154877]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:21 compute-1 sshd-session[154974]: Connection closed by 193.32.162.146 port 55464
Dec 02 23:38:21 compute-1 sudo[155030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdmnxfkybagtmmdndzddqfuogbdtainu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718701.4212935-2317-230400806680193/AnsiballZ_stat.py'
Dec 02 23:38:21 compute-1 sudo[155030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:22 compute-1 python3.9[155032]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:22 compute-1 sudo[155030]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:22 compute-1 sudo[155108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbesgknoeusxzhvojkwqukmrqjdpkztn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718701.4212935-2317-230400806680193/AnsiballZ_file.py'
Dec 02 23:38:22 compute-1 sudo[155108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:22 compute-1 python3.9[155110]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:22 compute-1 sudo[155108]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:23 compute-1 sudo[155260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yylyaaajlmkkmmwhuqabswbuadfkgmqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718702.921556-2343-124998166106193/AnsiballZ_command.py'
Dec 02 23:38:23 compute-1 sudo[155260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:23 compute-1 python3.9[155262]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:38:23 compute-1 sudo[155260]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:24 compute-1 sudo[155413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlcqkdcxgqmlinpqofftgbvyinqaisqv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718703.7890007-2359-96320930085012/AnsiballZ_edpm_nftables_from_files.py'
Dec 02 23:38:24 compute-1 sudo[155413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:24 compute-1 python3[155415]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 02 23:38:24 compute-1 sudo[155413]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:25 compute-1 sudo[155565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmjtyqzgouleitdxagtwgjbepjzumpum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718704.7620049-2375-55447940414708/AnsiballZ_stat.py'
Dec 02 23:38:25 compute-1 sudo[155565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:25 compute-1 python3.9[155567]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:25 compute-1 sudo[155565]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:25 compute-1 sudo[155643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddubqgclgtemvfbvckkugijunugntymr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718704.7620049-2375-55447940414708/AnsiballZ_file.py'
Dec 02 23:38:25 compute-1 sudo[155643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:25 compute-1 python3.9[155645]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:25 compute-1 sudo[155643]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:26 compute-1 sudo[155795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kusqjxmfceuimlxmshtvsupazqiperag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718706.2302246-2399-225275905026156/AnsiballZ_stat.py'
Dec 02 23:38:26 compute-1 sudo[155795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:26 compute-1 python3.9[155797]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:26 compute-1 sudo[155795]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:27 compute-1 sudo[155873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ombpwaymzhbczlkjjbdeeroeozjklbss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718706.2302246-2399-225275905026156/AnsiballZ_file.py'
Dec 02 23:38:27 compute-1 sudo[155873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:27 compute-1 python3.9[155875]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:27 compute-1 sudo[155873]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:28 compute-1 sudo[156025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lskazfznjnlerrbrbazqmjtdomipoilz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718707.749108-2423-209294958232130/AnsiballZ_stat.py'
Dec 02 23:38:28 compute-1 sudo[156025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:28 compute-1 python3.9[156027]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:28 compute-1 sudo[156025]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:28 compute-1 sudo[156103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snoykoibapxsglednyrbtzkesyclpbhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718707.749108-2423-209294958232130/AnsiballZ_file.py'
Dec 02 23:38:28 compute-1 sudo[156103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:28 compute-1 python3.9[156105]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:28 compute-1 sudo[156103]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:29 compute-1 sudo[156255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvmkxhgsiwbsafowdxmzydbjqmoohelz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718709.1958838-2447-157692943263882/AnsiballZ_stat.py'
Dec 02 23:38:29 compute-1 sudo[156255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:29 compute-1 python3.9[156257]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:29 compute-1 sudo[156255]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:30 compute-1 sudo[156333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thzlyuakygvmxuapuxfmymjmdyselqxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718709.1958838-2447-157692943263882/AnsiballZ_file.py'
Dec 02 23:38:30 compute-1 sudo[156333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:30 compute-1 python3.9[156335]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:30 compute-1 sudo[156333]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:31 compute-1 sudo[156503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fynzmzpffhpsdywvsrpvibrrtafblaie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718710.638548-2471-186087510790887/AnsiballZ_stat.py'
Dec 02 23:38:31 compute-1 sudo[156503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:31 compute-1 podman[156459]: 2025-12-02 23:38:31.127179679 +0000 UTC m=+0.128309256 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 02 23:38:31 compute-1 python3.9[156508]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:31 compute-1 sudo[156503]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:31 compute-1 sudo[156638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olupaipdgnvxmixgvmpjrkyxrizwaohb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718710.638548-2471-186087510790887/AnsiballZ_copy.py'
Dec 02 23:38:31 compute-1 sudo[156638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:32 compute-1 python3.9[156640]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718710.638548-2471-186087510790887/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:32 compute-1 sudo[156638]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:32 compute-1 sudo[156791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiazetkclggfsbaizoohdpexwowslvdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718712.3599675-2501-189106072568181/AnsiballZ_file.py'
Dec 02 23:38:32 compute-1 sudo[156791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:33 compute-1 python3.9[156793]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:33 compute-1 sshd-session[156587]: Invalid user prueba from 185.156.73.233 port 49286
Dec 02 23:38:33 compute-1 sudo[156791]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:33 compute-1 sshd-session[156587]: Connection closed by invalid user prueba 185.156.73.233 port 49286 [preauth]
Dec 02 23:38:33 compute-1 sudo[156943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-refiofkbaihgsuovogdjjgclhkwvbzzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718713.3166378-2518-28193944697034/AnsiballZ_command.py'
Dec 02 23:38:33 compute-1 sudo[156943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:33 compute-1 python3.9[156945]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:38:33 compute-1 sudo[156943]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:34 compute-1 sudo[157111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfyngwaqivsfccnqcykedygiehhouiru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718714.028618-2533-85309965851431/AnsiballZ_blockinfile.py'
Dec 02 23:38:34 compute-1 sudo[157111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:34 compute-1 podman[157072]: 2025-12-02 23:38:34.488059408 +0000 UTC m=+0.062984199 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Dec 02 23:38:34 compute-1 python3.9[157119]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:34 compute-1 sudo[157111]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:35 compute-1 sudo[157269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxfxlnipiprvtndpdyzvwbkqbngvlqfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718714.954315-2551-209688807946578/AnsiballZ_command.py'
Dec 02 23:38:35 compute-1 sudo[157269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:35 compute-1 python3.9[157271]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:38:35 compute-1 sudo[157269]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:36 compute-1 sudo[157422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idsicbpoebxvakqtlwjbsdwztavgcrxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718715.7292933-2567-267089521787073/AnsiballZ_stat.py'
Dec 02 23:38:36 compute-1 sudo[157422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:36 compute-1 python3.9[157424]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:38:36 compute-1 sudo[157422]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:36 compute-1 sudo[157576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icomceuyuyznzfftfilmatxkvnhqarwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718716.4407852-2583-222180033169612/AnsiballZ_command.py'
Dec 02 23:38:36 compute-1 sudo[157576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:37 compute-1 python3.9[157578]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:38:37 compute-1 sudo[157576]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:37 compute-1 sudo[157731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eysjyofhfqivlnkecapmbtphvbknsfjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718717.3733704-2600-118157335237089/AnsiballZ_file.py'
Dec 02 23:38:37 compute-1 sudo[157731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:37 compute-1 python3.9[157733]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:37 compute-1 sudo[157731]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:38 compute-1 sudo[157883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtsuzciuvhcgbeuuotfbpazicpkbcgat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718718.1494734-2615-119636700578509/AnsiballZ_stat.py'
Dec 02 23:38:38 compute-1 sudo[157883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:38 compute-1 python3.9[157885]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:38 compute-1 sudo[157883]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:39 compute-1 sudo[158006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjcfilmqrzxhhstuxlglghfrxrajvvpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718718.1494734-2615-119636700578509/AnsiballZ_copy.py'
Dec 02 23:38:39 compute-1 sudo[158006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:39 compute-1 python3.9[158008]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718718.1494734-2615-119636700578509/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:39 compute-1 sudo[158006]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:40 compute-1 sudo[158158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jitbcbykhdzrptgaanfzlngmionhaidh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718719.7123148-2646-278905144589516/AnsiballZ_stat.py'
Dec 02 23:38:40 compute-1 sudo[158158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:40 compute-1 python3.9[158160]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:40 compute-1 sudo[158158]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:40 compute-1 sudo[158281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glasjmubwfhsmzphllmfefsxfkdczyul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718719.7123148-2646-278905144589516/AnsiballZ_copy.py'
Dec 02 23:38:40 compute-1 sudo[158281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:40 compute-1 python3.9[158283]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718719.7123148-2646-278905144589516/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:41 compute-1 sudo[158281]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:41 compute-1 sudo[158433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdpplpzakqlhoqojbrewxpobgxkicwlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718721.2716298-2675-56818030568862/AnsiballZ_stat.py'
Dec 02 23:38:41 compute-1 sudo[158433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:41 compute-1 python3.9[158435]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:41 compute-1 sudo[158433]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:42 compute-1 sudo[158556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbcqmpznlswaopbpbhdiztlaociywyht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718721.2716298-2675-56818030568862/AnsiballZ_copy.py'
Dec 02 23:38:42 compute-1 sudo[158556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:42 compute-1 python3.9[158558]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718721.2716298-2675-56818030568862/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:42 compute-1 sudo[158556]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:43 compute-1 sudo[158708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oulnldxfabkfnbrideeaawfxorrfclxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718722.7902741-2705-133645247298589/AnsiballZ_systemd.py'
Dec 02 23:38:43 compute-1 sudo[158708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:43 compute-1 python3.9[158710]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:38:43 compute-1 systemd[1]: Reloading.
Dec 02 23:38:43 compute-1 systemd-rc-local-generator[158737]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:38:43 compute-1 systemd-sysv-generator[158742]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:38:43 compute-1 systemd[1]: Reached target edpm_libvirt.target.
Dec 02 23:38:43 compute-1 sudo[158708]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:44 compute-1 sudo[158900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzlyijmhccvlhdnfnfafefobbaokvpgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718724.1980453-2721-211504273870993/AnsiballZ_systemd.py'
Dec 02 23:38:44 compute-1 sudo[158900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:44 compute-1 python3.9[158902]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 02 23:38:44 compute-1 systemd[1]: Reloading.
Dec 02 23:38:45 compute-1 systemd-rc-local-generator[158929]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:38:45 compute-1 systemd-sysv-generator[158933]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:38:45 compute-1 systemd[1]: Reloading.
Dec 02 23:38:45 compute-1 systemd-rc-local-generator[158967]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:38:45 compute-1 systemd-sysv-generator[158970]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:38:45 compute-1 sudo[158900]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:46 compute-1 sshd-session[104472]: Connection closed by 192.168.122.30 port 57342
Dec 02 23:38:46 compute-1 sshd-session[104469]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:38:46 compute-1 systemd[1]: session-23.scope: Deactivated successfully.
Dec 02 23:38:46 compute-1 systemd[1]: session-23.scope: Consumed 3min 48.997s CPU time.
Dec 02 23:38:46 compute-1 systemd-logind[790]: Session 23 logged out. Waiting for processes to exit.
Dec 02 23:38:46 compute-1 systemd-logind[790]: Removed session 23.
Dec 02 23:38:51 compute-1 sshd-session[158999]: Accepted publickey for zuul from 192.168.122.30 port 46660 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:38:51 compute-1 systemd-logind[790]: New session 24 of user zuul.
Dec 02 23:38:51 compute-1 systemd[1]: Started Session 24 of User zuul.
Dec 02 23:38:51 compute-1 sshd-session[158999]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:38:52 compute-1 python3.9[159152]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:38:53 compute-1 python3.9[159306]: ansible-ansible.builtin.service_facts Invoked
Dec 02 23:38:54 compute-1 network[159323]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 23:38:54 compute-1 network[159324]: 'network-scripts' will be removed from distribution in near future.
Dec 02 23:38:54 compute-1 network[159325]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 23:39:00 compute-1 sudo[159594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcwkutparbrvnyzgvxasgnjlpjbenxjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718740.0922232-75-200247082111120/AnsiballZ_setup.py'
Dec 02 23:39:00 compute-1 sudo[159594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:00 compute-1 python3.9[159596]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:39:00 compute-1 sudo[159594]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:01 compute-1 sudo[159693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbnnuvjkdbjridjkqhmjdwmybyzundix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718740.0922232-75-200247082111120/AnsiballZ_dnf.py'
Dec 02 23:39:01 compute-1 sudo[159693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:01 compute-1 podman[159652]: 2025-12-02 23:39:01.56817442 +0000 UTC m=+0.129020565 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 23:39:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:39:01.661 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:39:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:39:01.662 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:39:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:39:01.662 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:39:01 compute-1 python3.9[159700]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:39:05 compute-1 podman[159711]: 2025-12-02 23:39:05.258084748 +0000 UTC m=+0.081977193 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 02 23:39:07 compute-1 sudo[159693]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:07 compute-1 sudo[159879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvbhauevbeviuxlhkwzgohdjvbkiznkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718747.3269124-99-21480686271890/AnsiballZ_stat.py'
Dec 02 23:39:07 compute-1 sudo[159879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:07 compute-1 python3.9[159881]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:39:08 compute-1 sudo[159879]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:08 compute-1 sudo[160031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttelpmhnymhkoytxgbdtuygmlrzawwkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718748.3639903-119-193623392895044/AnsiballZ_command.py'
Dec 02 23:39:08 compute-1 sudo[160031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:09 compute-1 python3.9[160033]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:39:09 compute-1 sudo[160031]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:09 compute-1 sudo[160184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpahmglirwaqelfernroicbeoxniglzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718749.4171367-139-44852955046498/AnsiballZ_stat.py'
Dec 02 23:39:09 compute-1 sudo[160184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:10 compute-1 python3.9[160186]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:39:10 compute-1 sudo[160184]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:10 compute-1 sudo[160336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoqczwabxcjhtrcklylyauzxxyqyrwqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718750.308038-155-268710557415152/AnsiballZ_command.py'
Dec 02 23:39:10 compute-1 sudo[160336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:10 compute-1 python3.9[160338]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:39:10 compute-1 sudo[160336]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:11 compute-1 sudo[160489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtaqgfmcdswbjfdvfhpvvbvaocrtyktf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718751.114899-171-167215584900278/AnsiballZ_stat.py'
Dec 02 23:39:11 compute-1 sudo[160489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:11 compute-1 python3.9[160491]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:11 compute-1 sudo[160489]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:12 compute-1 sudo[160612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxdyhdkwoykhbksvyrwxjytzbgeqqljn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718751.114899-171-167215584900278/AnsiballZ_copy.py'
Dec 02 23:39:12 compute-1 sudo[160612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:12 compute-1 python3.9[160614]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718751.114899-171-167215584900278/.source.iscsi _original_basename=.gizl8688 follow=False checksum=64a99d1e29a0ec92d5736326d23453244039a734 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:12 compute-1 sudo[160612]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:13 compute-1 sudo[160764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiemytowrllhvhzzojxivgktoeupzfqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718752.6071532-201-198867292792821/AnsiballZ_file.py'
Dec 02 23:39:13 compute-1 sudo[160764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:13 compute-1 python3.9[160766]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:13 compute-1 sudo[160764]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:14 compute-1 sudo[160916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncinyfndiufctpilkfvabyxnkmkezeos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718753.5919712-217-244547925369961/AnsiballZ_lineinfile.py'
Dec 02 23:39:14 compute-1 sudo[160916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:14 compute-1 python3.9[160918]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:14 compute-1 sudo[160916]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:14 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 23:39:15 compute-1 sudo[161069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjqekuwmytrqzgmvvgjbczlitqjfevnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718754.688821-235-219353101997759/AnsiballZ_systemd_service.py'
Dec 02 23:39:15 compute-1 sudo[161069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:15 compute-1 python3.9[161071]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:39:16 compute-1 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec 02 23:39:16 compute-1 sudo[161069]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:17 compute-1 sudo[161225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpytltkgabzscdodexteunvvkdbptyis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718756.9802861-251-74064446558452/AnsiballZ_systemd_service.py'
Dec 02 23:39:17 compute-1 sudo[161225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:17 compute-1 python3.9[161227]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:39:17 compute-1 systemd[1]: Reloading.
Dec 02 23:39:17 compute-1 systemd-rc-local-generator[161256]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:39:17 compute-1 systemd-sysv-generator[161260]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:39:17 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 02 23:39:17 compute-1 systemd[1]: Starting Open-iSCSI...
Dec 02 23:39:18 compute-1 kernel: Loading iSCSI transport class v2.0-870.
Dec 02 23:39:18 compute-1 systemd[1]: Started Open-iSCSI.
Dec 02 23:39:18 compute-1 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec 02 23:39:18 compute-1 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec 02 23:39:18 compute-1 sudo[161225]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:18 compute-1 sudo[161425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiaszrdvctsrmufwbiithsiispgotbvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718758.5721843-273-131223158974306/AnsiballZ_service_facts.py'
Dec 02 23:39:18 compute-1 sudo[161425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:19 compute-1 python3.9[161427]: ansible-ansible.builtin.service_facts Invoked
Dec 02 23:39:19 compute-1 network[161444]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 23:39:19 compute-1 network[161445]: 'network-scripts' will be removed from distribution in near future.
Dec 02 23:39:19 compute-1 network[161446]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 23:39:22 compute-1 sudo[161425]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:24 compute-1 sudo[161715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uidybxgnsiuafyiwnsprepnlpuhotxbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718763.9391892-293-171235664326315/AnsiballZ_file.py'
Dec 02 23:39:24 compute-1 sudo[161715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:24 compute-1 python3.9[161717]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 23:39:24 compute-1 sudo[161715]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:25 compute-1 sudo[161868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cohihtgejtzucpavtqobueyxnjhgapua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718764.8781922-309-270785047916329/AnsiballZ_modprobe.py'
Dec 02 23:39:25 compute-1 sudo[161868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:25 compute-1 python3.9[161870]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 02 23:39:25 compute-1 sudo[161868]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:26 compute-1 sudo[162024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuvnnrlflnpzvkdnjyqfihmhuwqtscjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718765.7834165-325-127000508295617/AnsiballZ_stat.py'
Dec 02 23:39:26 compute-1 sudo[162024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:26 compute-1 python3.9[162026]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:26 compute-1 sudo[162024]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:26 compute-1 sudo[162147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtvoifykzhndffsknfvzhxdyihjowumd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718765.7834165-325-127000508295617/AnsiballZ_copy.py'
Dec 02 23:39:26 compute-1 sudo[162147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:26 compute-1 python3.9[162149]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718765.7834165-325-127000508295617/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:26 compute-1 sudo[162147]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:27 compute-1 sudo[162299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odkqmvsifjhzrdmnqzvyfrfhceiibahc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718767.2612784-357-131812127258819/AnsiballZ_lineinfile.py'
Dec 02 23:39:27 compute-1 sudo[162299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:27 compute-1 python3.9[162301]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:27 compute-1 sudo[162299]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:28 compute-1 sudo[162451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdyqafncckiunotvgvrxwxrerqmpqhfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718768.188254-373-280175030789495/AnsiballZ_systemd.py'
Dec 02 23:39:28 compute-1 sudo[162451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:29 compute-1 python3.9[162453]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:39:29 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 02 23:39:29 compute-1 systemd[1]: Stopped Load Kernel Modules.
Dec 02 23:39:29 compute-1 systemd[1]: Stopping Load Kernel Modules...
Dec 02 23:39:29 compute-1 systemd[1]: Starting Load Kernel Modules...
Dec 02 23:39:29 compute-1 systemd[1]: Finished Load Kernel Modules.
Dec 02 23:39:29 compute-1 sudo[162451]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:29 compute-1 sudo[162607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-humcqeehvdhzmhxklcnvwivrfbozzsav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718769.513108-389-170261826800586/AnsiballZ_file.py'
Dec 02 23:39:29 compute-1 sudo[162607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:30 compute-1 python3.9[162609]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:39:30 compute-1 sudo[162607]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:30 compute-1 sudo[162759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puezokivimekffzcixksysybsfqbwlzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718770.5670512-407-149424823536773/AnsiballZ_stat.py'
Dec 02 23:39:30 compute-1 sudo[162759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:31 compute-1 python3.9[162761]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:39:31 compute-1 sudo[162759]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:31 compute-1 sudo[162920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uomcqejsrnvfknzkobweohfwvbkntkpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718771.4363422-425-72664212475451/AnsiballZ_stat.py'
Dec 02 23:39:31 compute-1 sudo[162920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:31 compute-1 podman[162885]: 2025-12-02 23:39:31.941833934 +0000 UTC m=+0.153243888 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 02 23:39:32 compute-1 python3.9[162930]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:39:32 compute-1 sudo[162920]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:32 compute-1 sudo[163090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpjmushcqfhrwuetcplwdqzvoltoqlve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718772.4453213-441-61273848541704/AnsiballZ_stat.py'
Dec 02 23:39:32 compute-1 sudo[163090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:32 compute-1 python3.9[163092]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:32 compute-1 sudo[163090]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:33 compute-1 sudo[163213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cocckszlgkjxnuhszfpayuxqggnfrhiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718772.4453213-441-61273848541704/AnsiballZ_copy.py'
Dec 02 23:39:33 compute-1 sudo[163213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:33 compute-1 python3.9[163215]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718772.4453213-441-61273848541704/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:33 compute-1 sudo[163213]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:35 compute-1 sudo[163366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxcijfffftnontqtlykqtzaaperubppo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718774.0148544-471-270698912529757/AnsiballZ_command.py'
Dec 02 23:39:35 compute-1 sudo[163366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:35 compute-1 python3.9[163368]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:39:35 compute-1 sudo[163366]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:35 compute-1 podman[163370]: 2025-12-02 23:39:35.407535668 +0000 UTC m=+0.082789043 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:39:35 compute-1 sudo[163539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqttghsgdxqozjcymfobrsnkhtupbktw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718775.5090575-487-229048868536036/AnsiballZ_lineinfile.py'
Dec 02 23:39:35 compute-1 sudo[163539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:36 compute-1 python3.9[163541]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:36 compute-1 sudo[163539]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:36 compute-1 sudo[163691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iolsipnpwfvjssnzsgionqsoodwxnlkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718776.311458-504-177227109104287/AnsiballZ_replace.py'
Dec 02 23:39:36 compute-1 sudo[163691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:37 compute-1 python3.9[163693]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:37 compute-1 sudo[163691]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:37 compute-1 sudo[163843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzxyiqjjbxhkntealuytzihitnrfimmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718777.3321311-519-26308909966453/AnsiballZ_replace.py'
Dec 02 23:39:37 compute-1 sudo[163843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:37 compute-1 python3.9[163845]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:37 compute-1 sudo[163843]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:38 compute-1 sudo[163995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdetuwyghwsumotadawnpadawtoawvht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718778.277484-537-170783870538868/AnsiballZ_lineinfile.py'
Dec 02 23:39:38 compute-1 sudo[163995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:38 compute-1 python3.9[163997]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:38 compute-1 sudo[163995]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:39 compute-1 sudo[164147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvpxyscpxlqrstnuyfopymwygrssbywc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718779.0993001-537-270338286752411/AnsiballZ_lineinfile.py'
Dec 02 23:39:39 compute-1 sudo[164147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:39 compute-1 python3.9[164149]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:39 compute-1 sudo[164147]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:40 compute-1 sudo[164299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abohbfommumbvxmpcopfxceuojabtaub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718779.8608851-537-123675130551036/AnsiballZ_lineinfile.py'
Dec 02 23:39:40 compute-1 sudo[164299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:40 compute-1 python3.9[164301]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:40 compute-1 sudo[164299]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:41 compute-1 sudo[164451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnrobznaeoavnpducengokknmrrfcrga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718780.6534247-537-252345624505598/AnsiballZ_lineinfile.py'
Dec 02 23:39:41 compute-1 sudo[164451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:41 compute-1 python3.9[164453]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:41 compute-1 sudo[164451]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:41 compute-1 sudo[164603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntezktkclbkvhbumbajcbsqznskkufne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718781.5684698-595-89798227262837/AnsiballZ_stat.py'
Dec 02 23:39:41 compute-1 sudo[164603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:42 compute-1 python3.9[164605]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:39:42 compute-1 sudo[164603]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:42 compute-1 sudo[164757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiqusoabgkypicflhhtcjqcgjjkdwyhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718782.4595497-611-206483170281111/AnsiballZ_file.py'
Dec 02 23:39:42 compute-1 sudo[164757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:42 compute-1 python3.9[164759]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:42 compute-1 sudo[164757]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:43 compute-1 sudo[164909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfexavhkgzmvavujwtvabculvfeeesoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718783.3310802-629-259819817880075/AnsiballZ_file.py'
Dec 02 23:39:43 compute-1 sudo[164909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:43 compute-1 python3.9[164911]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:39:43 compute-1 sudo[164909]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:44 compute-1 sudo[165061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxtrgjzegeipjpjyrmcgbgrygcoqwvwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718784.1847107-645-114728345682413/AnsiballZ_stat.py'
Dec 02 23:39:44 compute-1 sudo[165061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:44 compute-1 python3.9[165063]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:44 compute-1 sudo[165061]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:45 compute-1 sudo[165139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocuemunfzqnihoiiyjfexwxjxjeufkuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718784.1847107-645-114728345682413/AnsiballZ_file.py'
Dec 02 23:39:45 compute-1 sudo[165139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:45 compute-1 python3.9[165141]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:39:45 compute-1 sudo[165139]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:45 compute-1 sudo[165291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyipvolhkwncmltqdckgtncibkpphgon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718785.4907787-645-134709549371997/AnsiballZ_stat.py'
Dec 02 23:39:45 compute-1 sudo[165291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:46 compute-1 python3.9[165293]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:46 compute-1 sudo[165291]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:46 compute-1 sudo[165369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmoqtkjvznwfytypvmijvhqbazbpkkyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718785.4907787-645-134709549371997/AnsiballZ_file.py'
Dec 02 23:39:46 compute-1 sudo[165369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:46 compute-1 python3.9[165371]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:39:46 compute-1 sudo[165369]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:47 compute-1 sudo[165521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kettclvsreqrdlgkupcwvskmbkwpvcmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718786.9955518-691-141553274744027/AnsiballZ_file.py'
Dec 02 23:39:47 compute-1 sudo[165521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:47 compute-1 python3.9[165523]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:47 compute-1 sudo[165521]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:48 compute-1 sudo[165673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odwmvbplzmwiehqnkgzwbzdhncvcewtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718787.8782332-707-38865659727470/AnsiballZ_stat.py'
Dec 02 23:39:48 compute-1 sudo[165673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:48 compute-1 python3.9[165675]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:48 compute-1 sudo[165673]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:48 compute-1 sudo[165751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oppmzmadqpddhbajkujjqxneciqenluq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718787.8782332-707-38865659727470/AnsiballZ_file.py'
Dec 02 23:39:48 compute-1 sudo[165751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:49 compute-1 python3.9[165753]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:49 compute-1 sudo[165751]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:49 compute-1 sudo[165903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kviurinfhteroqfpumhkilhhthdkdjko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718789.577745-732-210902900464211/AnsiballZ_stat.py'
Dec 02 23:39:49 compute-1 sudo[165903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:50 compute-1 python3.9[165905]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:50 compute-1 sudo[165903]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:50 compute-1 sudo[165981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asvmtieyfqomxpqgtmoqtnrsahgtnqcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718789.577745-732-210902900464211/AnsiballZ_file.py'
Dec 02 23:39:50 compute-1 sudo[165981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:50 compute-1 python3.9[165983]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:50 compute-1 sudo[165981]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:51 compute-1 sudo[166133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruwkoygpbmxckfrqltophdvyvdpdywfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718791.017933-755-193728697875821/AnsiballZ_systemd.py'
Dec 02 23:39:51 compute-1 sudo[166133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:51 compute-1 python3.9[166135]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:39:51 compute-1 systemd[1]: Reloading.
Dec 02 23:39:51 compute-1 systemd-rc-local-generator[166162]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:39:51 compute-1 systemd-sysv-generator[166168]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:39:52 compute-1 sudo[166133]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:52 compute-1 sudo[166322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxrnbdxebdqigrxnxtjdulccqwzwgblx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718792.4565785-771-255176910394417/AnsiballZ_stat.py'
Dec 02 23:39:52 compute-1 sudo[166322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:52 compute-1 python3.9[166324]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:53 compute-1 sudo[166322]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:53 compute-1 sudo[166400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jslwxtqqeohpedabfkqicwezfurqkumo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718792.4565785-771-255176910394417/AnsiballZ_file.py'
Dec 02 23:39:53 compute-1 sudo[166400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:53 compute-1 python3.9[166402]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:53 compute-1 sudo[166400]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:54 compute-1 sudo[166552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubgtaclkeuynrhfthvhurfjwurerdhgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718793.800797-795-51839580324939/AnsiballZ_stat.py'
Dec 02 23:39:54 compute-1 sudo[166552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:54 compute-1 python3.9[166554]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:54 compute-1 sudo[166552]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:54 compute-1 sudo[166630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdshdpqllrlqdifdrurtwfethjdawuzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718793.800797-795-51839580324939/AnsiballZ_file.py'
Dec 02 23:39:54 compute-1 sudo[166630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:55 compute-1 python3.9[166632]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:55 compute-1 sudo[166630]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:55 compute-1 sudo[166782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcbweiinfbbhcveelvupxnueejocydze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718795.3813567-819-119496375082262/AnsiballZ_systemd.py'
Dec 02 23:39:55 compute-1 sudo[166782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:56 compute-1 python3.9[166784]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:39:56 compute-1 systemd[1]: Reloading.
Dec 02 23:39:56 compute-1 systemd-rc-local-generator[166810]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:39:56 compute-1 systemd-sysv-generator[166814]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:39:56 compute-1 systemd[1]: Starting Create netns directory...
Dec 02 23:39:56 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 23:39:56 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 23:39:56 compute-1 systemd[1]: Finished Create netns directory.
Dec 02 23:39:56 compute-1 sudo[166782]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:57 compute-1 sudo[166975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbskujyinucpwdrrfjzybcfrhhikgbqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718796.9901218-839-187077796333671/AnsiballZ_file.py'
Dec 02 23:39:57 compute-1 sudo[166975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:57 compute-1 python3.9[166977]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:39:57 compute-1 sudo[166975]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:58 compute-1 sudo[167127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhhbmotidkgpcpopzkuocuoffqueqodk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718797.8839574-855-117270789301442/AnsiballZ_stat.py'
Dec 02 23:39:58 compute-1 sudo[167127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:58 compute-1 python3.9[167129]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:58 compute-1 sudo[167127]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:59 compute-1 sudo[167250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czronszkcqfeaepddoifauelztdleobq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718797.8839574-855-117270789301442/AnsiballZ_copy.py'
Dec 02 23:39:59 compute-1 sudo[167250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:59 compute-1 python3.9[167252]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718797.8839574-855-117270789301442/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:39:59 compute-1 sudo[167250]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:00 compute-1 sudo[167402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hypzvhnzuvxccuungihcypwkyebzlqzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718799.9271939-890-88270270604066/AnsiballZ_file.py'
Dec 02 23:40:00 compute-1 sudo[167402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:00 compute-1 python3.9[167404]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:40:00 compute-1 sudo[167402]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:01 compute-1 sudo[167554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sptmvmvdwwbyuoiqlqaknwsglgaervhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718800.8257346-905-275739882929545/AnsiballZ_stat.py'
Dec 02 23:40:01 compute-1 sudo[167554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:01 compute-1 python3.9[167556]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:40:01 compute-1 sudo[167554]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:40:01.665 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:40:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:40:01.667 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:40:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:40:01.667 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:40:02 compute-1 sudo[167678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fytpsdcffhawlqvteqvyandhgdtrsepb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718800.8257346-905-275739882929545/AnsiballZ_copy.py'
Dec 02 23:40:02 compute-1 sudo[167678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:02 compute-1 podman[167680]: 2025-12-02 23:40:02.188411067 +0000 UTC m=+0.142898836 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 23:40:02 compute-1 python3.9[167681]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718800.8257346-905-275739882929545/.source.json _original_basename=.5nqbr_vy follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:02 compute-1 sudo[167678]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:02 compute-1 sudo[167857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmnbwscwtypxrgthqzrkqomffpjqsuex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718802.4815528-935-260586395037867/AnsiballZ_file.py'
Dec 02 23:40:02 compute-1 sudo[167857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:03 compute-1 python3.9[167859]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:03 compute-1 sudo[167857]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:03 compute-1 sudo[168010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-campjvmranguzfzhwdaynyvplcfyxffg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718803.3635144-951-71379530015514/AnsiballZ_stat.py'
Dec 02 23:40:03 compute-1 sudo[168010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:04 compute-1 sudo[168010]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:04 compute-1 sudo[168133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fttmvzycxdqcpirhsvykggancfechwzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718803.3635144-951-71379530015514/AnsiballZ_copy.py'
Dec 02 23:40:04 compute-1 sudo[168133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:04 compute-1 sudo[168133]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:05 compute-1 sudo[168300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aidrkmswxslmmbivrtsqjgjkxrgsvqkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718805.3287208-986-184381053201215/AnsiballZ_container_config_data.py'
Dec 02 23:40:05 compute-1 sudo[168300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:05 compute-1 podman[168259]: 2025-12-02 23:40:05.921664591 +0000 UTC m=+0.089666931 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 02 23:40:06 compute-1 python3.9[168306]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 02 23:40:06 compute-1 sudo[168300]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:07 compute-1 sudo[168456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqorskabrntasdflhbpdfqgyfaljxwnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718806.4922855-1003-85500193245569/AnsiballZ_container_config_hash.py'
Dec 02 23:40:07 compute-1 sudo[168456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:07 compute-1 python3.9[168458]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 23:40:07 compute-1 sudo[168456]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:08 compute-1 sudo[168608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzmsobgotgzaxgjvqrfffsecymeszpbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718807.6670165-1022-5368489566665/AnsiballZ_podman_container_info.py'
Dec 02 23:40:08 compute-1 sudo[168608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:08 compute-1 python3.9[168610]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 02 23:40:08 compute-1 sudo[168608]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:09 compute-1 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec 02 23:40:10 compute-1 sudo[168788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogxqftknrkekvqpctcgnxsipocjgqkrr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718809.459927-1047-183985652106058/AnsiballZ_edpm_container_manage.py'
Dec 02 23:40:10 compute-1 sudo[168788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:10 compute-1 python3[168790]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 23:40:10 compute-1 podman[168826]: 2025-12-02 23:40:10.648773003 +0000 UTC m=+0.080835561 container create 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=multipathd)
Dec 02 23:40:10 compute-1 podman[168826]: 2025-12-02 23:40:10.604107612 +0000 UTC m=+0.036170160 image pull 13a8acc03c3934b75192e1b3a8c127f56bf115253a854621e8e0e8b6330d5e9b 38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Dec 02 23:40:10 compute-1 python3[168790]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z 38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Dec 02 23:40:10 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 02 23:40:10 compute-1 sudo[168788]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:11 compute-1 sudo[169015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxrtmhxleigcpfiqfefeeunqgphaacap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718811.1349297-1063-263921918229384/AnsiballZ_stat.py'
Dec 02 23:40:11 compute-1 sudo[169015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:11 compute-1 python3.9[169017]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:40:11 compute-1 sudo[169015]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:12 compute-1 systemd[1]: virtqemud.service: Deactivated successfully.
Dec 02 23:40:12 compute-1 sudo[169170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esaisohslvwtbegontjicojnxbghetka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718812.1824782-1081-247978245643739/AnsiballZ_file.py'
Dec 02 23:40:12 compute-1 sudo[169170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:12 compute-1 python3.9[169172]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:12 compute-1 sudo[169170]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:13 compute-1 sudo[169246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heyfcqxstqmpntenmilgaleoqxxgfdtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718812.1824782-1081-247978245643739/AnsiballZ_stat.py'
Dec 02 23:40:13 compute-1 sudo[169246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:13 compute-1 python3.9[169248]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:40:13 compute-1 sudo[169246]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:13 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 02 23:40:14 compute-1 sudo[169398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wngzumpjiqhbgqjksbecyadodiewrrin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718813.44926-1081-15535517298038/AnsiballZ_copy.py'
Dec 02 23:40:14 compute-1 sudo[169398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:14 compute-1 python3.9[169400]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764718813.44926-1081-15535517298038/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:14 compute-1 sudo[169398]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:14 compute-1 sudo[169474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofizsholpkdlbypfypnlldxxyrxorvff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718813.44926-1081-15535517298038/AnsiballZ_systemd.py'
Dec 02 23:40:14 compute-1 sudo[169474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:14 compute-1 python3.9[169476]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:40:15 compute-1 systemd[1]: Reloading.
Dec 02 23:40:15 compute-1 systemd-rc-local-generator[169503]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:40:15 compute-1 systemd-sysv-generator[169507]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:40:15 compute-1 sudo[169474]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:15 compute-1 sudo[169585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcnbqbpxvclpsfpxdnchpgwftyntcnkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718813.44926-1081-15535517298038/AnsiballZ_systemd.py'
Dec 02 23:40:15 compute-1 sudo[169585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:16 compute-1 python3.9[169587]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:16 compute-1 systemd[1]: Reloading.
Dec 02 23:40:16 compute-1 systemd-rc-local-generator[169617]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:40:16 compute-1 systemd-sysv-generator[169621]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:40:16 compute-1 systemd[1]: Starting multipathd container...
Dec 02 23:40:16 compute-1 systemd[1]: Started libcrun container.
Dec 02 23:40:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b861e488871c8b17bcd68c187415264ca579bb269bbdf409b631cfe2d7383c23/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 02 23:40:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b861e488871c8b17bcd68c187415264ca579bb269bbdf409b631cfe2d7383c23/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 02 23:40:16 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7.
Dec 02 23:40:16 compute-1 podman[169627]: 2025-12-02 23:40:16.764311132 +0000 UTC m=+0.180438537 container init 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 02 23:40:16 compute-1 multipathd[169643]: + sudo -E kolla_set_configs
Dec 02 23:40:16 compute-1 podman[169627]: 2025-12-02 23:40:16.809754662 +0000 UTC m=+0.225882057 container start 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Dec 02 23:40:16 compute-1 podman[169627]: multipathd
Dec 02 23:40:16 compute-1 sudo[169650]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 23:40:16 compute-1 sudo[169650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 23:40:16 compute-1 systemd[1]: Started multipathd container.
Dec 02 23:40:16 compute-1 sudo[169585]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:16 compute-1 multipathd[169643]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 23:40:16 compute-1 multipathd[169643]: INFO:__main__:Validating config file
Dec 02 23:40:16 compute-1 multipathd[169643]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 23:40:16 compute-1 multipathd[169643]: INFO:__main__:Writing out command to execute
Dec 02 23:40:16 compute-1 sudo[169650]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:16 compute-1 multipathd[169643]: ++ cat /run_command
Dec 02 23:40:16 compute-1 multipathd[169643]: + CMD='/usr/sbin/multipathd -d'
Dec 02 23:40:16 compute-1 multipathd[169643]: + ARGS=
Dec 02 23:40:16 compute-1 multipathd[169643]: + sudo kolla_copy_cacerts
Dec 02 23:40:16 compute-1 podman[169649]: 2025-12-02 23:40:16.924400755 +0000 UTC m=+0.098296876 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd)
Dec 02 23:40:16 compute-1 sudo[169673]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 02 23:40:16 compute-1 sudo[169673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 23:40:16 compute-1 systemd[1]: 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7-64574aa783b5c59c.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 23:40:16 compute-1 systemd[1]: 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7-64574aa783b5c59c.service: Failed with result 'exit-code'.
Dec 02 23:40:16 compute-1 sudo[169673]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:16 compute-1 multipathd[169643]: + [[ ! -n '' ]]
Dec 02 23:40:16 compute-1 multipathd[169643]: + . kolla_extend_start
Dec 02 23:40:16 compute-1 multipathd[169643]: Running command: '/usr/sbin/multipathd -d'
Dec 02 23:40:16 compute-1 multipathd[169643]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 02 23:40:16 compute-1 multipathd[169643]: + umask 0022
Dec 02 23:40:16 compute-1 multipathd[169643]: + exec /usr/sbin/multipathd -d
Dec 02 23:40:16 compute-1 multipathd[169643]: 2836.684746 | multipathd v0.9.9: start up
Dec 02 23:40:16 compute-1 multipathd[169643]: 2836.698583 | reconfigure: setting up paths and maps
Dec 02 23:40:16 compute-1 multipathd[169643]: 2836.700856 | _check_bindings_file: failed to read header from /etc/multipath/bindings
Dec 02 23:40:16 compute-1 multipathd[169643]: 2836.703131 | updated bindings file /etc/multipath/bindings
Dec 02 23:40:17 compute-1 python3.9[169832]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:40:18 compute-1 sudo[169984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qaiuoxartsbhwidbehnfnfbtdlmbdhtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718817.9112117-1153-177405283343452/AnsiballZ_command.py'
Dec 02 23:40:18 compute-1 sudo[169984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:18 compute-1 python3.9[169986]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:40:18 compute-1 sudo[169984]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:19 compute-1 sudo[170148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blpjgjkniwtkbhfdbktbgoiwmesofwgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718818.8710454-1169-6038641859103/AnsiballZ_systemd.py'
Dec 02 23:40:19 compute-1 sudo[170148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:19 compute-1 python3.9[170150]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:40:19 compute-1 systemd[1]: Stopping multipathd container...
Dec 02 23:40:19 compute-1 multipathd[169643]: 2839.476468 | multipathd: shut down
Dec 02 23:40:19 compute-1 systemd[1]: libpod-135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7.scope: Deactivated successfully.
Dec 02 23:40:19 compute-1 podman[170154]: 2025-12-02 23:40:19.805643921 +0000 UTC m=+0.112295908 container died 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4)
Dec 02 23:40:19 compute-1 systemd[1]: 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7-64574aa783b5c59c.timer: Deactivated successfully.
Dec 02 23:40:19 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7.
Dec 02 23:40:19 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7-userdata-shm.mount: Deactivated successfully.
Dec 02 23:40:19 compute-1 systemd[1]: var-lib-containers-storage-overlay-b861e488871c8b17bcd68c187415264ca579bb269bbdf409b631cfe2d7383c23-merged.mount: Deactivated successfully.
Dec 02 23:40:19 compute-1 podman[170154]: 2025-12-02 23:40:19.887752932 +0000 UTC m=+0.194404879 container cleanup 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible)
Dec 02 23:40:19 compute-1 podman[170154]: multipathd
Dec 02 23:40:19 compute-1 podman[170184]: multipathd
Dec 02 23:40:19 compute-1 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec 02 23:40:19 compute-1 systemd[1]: Stopped multipathd container.
Dec 02 23:40:19 compute-1 systemd[1]: Starting multipathd container...
Dec 02 23:40:20 compute-1 systemd[1]: Started libcrun container.
Dec 02 23:40:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b861e488871c8b17bcd68c187415264ca579bb269bbdf409b631cfe2d7383c23/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 02 23:40:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b861e488871c8b17bcd68c187415264ca579bb269bbdf409b631cfe2d7383c23/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 02 23:40:20 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7.
Dec 02 23:40:20 compute-1 podman[170197]: 2025-12-02 23:40:20.182315388 +0000 UTC m=+0.165948792 container init 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 23:40:20 compute-1 multipathd[170213]: + sudo -E kolla_set_configs
Dec 02 23:40:20 compute-1 sudo[170219]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 23:40:20 compute-1 sudo[170219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 23:40:20 compute-1 podman[170197]: 2025-12-02 23:40:20.220135627 +0000 UTC m=+0.203769021 container start 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 23:40:20 compute-1 podman[170197]: multipathd
Dec 02 23:40:20 compute-1 systemd[1]: Started multipathd container.
Dec 02 23:40:20 compute-1 sudo[170148]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:20 compute-1 multipathd[170213]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 23:40:20 compute-1 multipathd[170213]: INFO:__main__:Validating config file
Dec 02 23:40:20 compute-1 multipathd[170213]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 23:40:20 compute-1 multipathd[170213]: INFO:__main__:Writing out command to execute
Dec 02 23:40:20 compute-1 sudo[170219]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:20 compute-1 multipathd[170213]: ++ cat /run_command
Dec 02 23:40:20 compute-1 multipathd[170213]: + CMD='/usr/sbin/multipathd -d'
Dec 02 23:40:20 compute-1 multipathd[170213]: + ARGS=
Dec 02 23:40:20 compute-1 multipathd[170213]: + sudo kolla_copy_cacerts
Dec 02 23:40:20 compute-1 podman[170220]: 2025-12-02 23:40:20.313436133 +0000 UTC m=+0.078726261 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, tcib_build_tag=watcher_latest)
Dec 02 23:40:20 compute-1 systemd[1]: 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7-76487773cf7ead08.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 23:40:20 compute-1 systemd[1]: 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7-76487773cf7ead08.service: Failed with result 'exit-code'.
Dec 02 23:40:20 compute-1 sudo[170244]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 02 23:40:20 compute-1 sudo[170244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 23:40:20 compute-1 sudo[170244]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:20 compute-1 multipathd[170213]: Running command: '/usr/sbin/multipathd -d'
Dec 02 23:40:20 compute-1 multipathd[170213]: + [[ ! -n '' ]]
Dec 02 23:40:20 compute-1 multipathd[170213]: + . kolla_extend_start
Dec 02 23:40:20 compute-1 multipathd[170213]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 02 23:40:20 compute-1 multipathd[170213]: + umask 0022
Dec 02 23:40:20 compute-1 multipathd[170213]: + exec /usr/sbin/multipathd -d
Dec 02 23:40:20 compute-1 multipathd[170213]: 2840.064866 | multipathd v0.9.9: start up
Dec 02 23:40:20 compute-1 multipathd[170213]: 2840.073043 | reconfigure: setting up paths and maps
Dec 02 23:40:20 compute-1 sudo[170403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxwhnxixbuwnndpoqwgfttmxumpowupv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718820.5221672-1185-11110577049481/AnsiballZ_file.py'
Dec 02 23:40:20 compute-1 sudo[170403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:21 compute-1 python3.9[170405]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:21 compute-1 sudo[170403]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:22 compute-1 sudo[170555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orktdyiikiciaigzcegmeslnsgdwjqrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718821.6786022-1209-270919176490307/AnsiballZ_file.py'
Dec 02 23:40:22 compute-1 sudo[170555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:22 compute-1 python3.9[170557]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 23:40:22 compute-1 sudo[170555]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:22 compute-1 sudo[170707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eucebowepclrvjahefogztgryvstfckg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718822.5128658-1225-231556043383264/AnsiballZ_modprobe.py'
Dec 02 23:40:22 compute-1 sudo[170707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:23 compute-1 python3.9[170709]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 02 23:40:23 compute-1 kernel: Key type psk registered
Dec 02 23:40:23 compute-1 sudo[170707]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:23 compute-1 sudo[170869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cikmaubfvsagvhdlrtcedfaergyxmgrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718823.5515342-1241-104260760669644/AnsiballZ_stat.py'
Dec 02 23:40:23 compute-1 sudo[170869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:24 compute-1 python3.9[170871]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:40:24 compute-1 sudo[170869]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:24 compute-1 sudo[170992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyotayyasyovomhsspskzbsdwbjqipss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718823.5515342-1241-104260760669644/AnsiballZ_copy.py'
Dec 02 23:40:24 compute-1 sudo[170992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:24 compute-1 python3.9[170994]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718823.5515342-1241-104260760669644/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:24 compute-1 sudo[170992]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:25 compute-1 sudo[171144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkqpjbcetglfthzqafvgxmuhvajedjjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718825.1844501-1273-203217724297965/AnsiballZ_lineinfile.py'
Dec 02 23:40:25 compute-1 sudo[171144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:25 compute-1 python3.9[171146]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:25 compute-1 sudo[171144]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:26 compute-1 sudo[171296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axelhpbiihlptecyzysqzvdggtaiionf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718826.1064997-1289-193488525408433/AnsiballZ_systemd.py'
Dec 02 23:40:26 compute-1 sudo[171296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:26 compute-1 python3.9[171298]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:40:27 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 02 23:40:27 compute-1 systemd[1]: Stopped Load Kernel Modules.
Dec 02 23:40:27 compute-1 systemd[1]: Stopping Load Kernel Modules...
Dec 02 23:40:27 compute-1 systemd[1]: Starting Load Kernel Modules...
Dec 02 23:40:27 compute-1 systemd[1]: Finished Load Kernel Modules.
Dec 02 23:40:27 compute-1 sudo[171296]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:28 compute-1 sudo[171452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkhmgwuhuraerrfvujzlwlxcwmukfesv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718828.3337524-1305-264975774461434/AnsiballZ_dnf.py'
Dec 02 23:40:28 compute-1 sudo[171452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:29 compute-1 python3.9[171454]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:40:31 compute-1 systemd[1]: Reloading.
Dec 02 23:40:31 compute-1 systemd-rc-local-generator[171482]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:40:31 compute-1 systemd-sysv-generator[171488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:40:31 compute-1 systemd[1]: Reloading.
Dec 02 23:40:31 compute-1 systemd-sysv-generator[171524]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:40:31 compute-1 systemd-rc-local-generator[171518]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:40:32 compute-1 systemd-logind[790]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 02 23:40:32 compute-1 systemd-logind[790]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 02 23:40:32 compute-1 podman[171567]: 2025-12-02 23:40:32.420819614 +0000 UTC m=+0.139591077 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, container_name=ovn_controller)
Dec 02 23:40:32 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 23:40:32 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 02 23:40:32 compute-1 systemd[1]: Reloading.
Dec 02 23:40:32 compute-1 systemd-sysv-generator[171643]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:40:32 compute-1 systemd-rc-local-generator[171640]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:40:32 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 23:40:33 compute-1 sudo[171452]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:33 compute-1 sudo[172785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvbfedyssiwjevdjlkfuqkvbkaojhjkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718833.538982-1321-28532451907809/AnsiballZ_systemd_service.py'
Dec 02 23:40:33 compute-1 sudo[172785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:34 compute-1 python3.9[172814]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:40:34 compute-1 systemd[1]: Stopping Open-iSCSI...
Dec 02 23:40:34 compute-1 iscsid[161267]: iscsid shutting down.
Dec 02 23:40:34 compute-1 systemd[1]: iscsid.service: Deactivated successfully.
Dec 02 23:40:34 compute-1 systemd[1]: Stopped Open-iSCSI.
Dec 02 23:40:34 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 02 23:40:34 compute-1 systemd[1]: Starting Open-iSCSI...
Dec 02 23:40:34 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 23:40:34 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 02 23:40:34 compute-1 systemd[1]: man-db-cache-update.service: Consumed 2.253s CPU time.
Dec 02 23:40:34 compute-1 systemd[1]: Started Open-iSCSI.
Dec 02 23:40:34 compute-1 systemd[1]: run-r9ed9a4339dc247aab79b969ae67cf351.service: Deactivated successfully.
Dec 02 23:40:34 compute-1 sudo[172785]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:35 compute-1 python3.9[173086]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:40:36 compute-1 podman[173167]: 2025-12-02 23:40:36.250907999 +0000 UTC m=+0.075257619 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 23:40:36 compute-1 sudo[173257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arnqtrzdybbtizaoxfmujcssriicrshk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718836.055098-1357-205070411584214/AnsiballZ_file.py'
Dec 02 23:40:36 compute-1 sudo[173257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:36 compute-1 python3.9[173259]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:36 compute-1 sudo[173257]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:37 compute-1 sudo[173409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxlztvhyjsqzcvpikynpdykqyvqjjjni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718837.1922548-1378-156748401050747/AnsiballZ_systemd_service.py'
Dec 02 23:40:37 compute-1 sudo[173409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:37 compute-1 python3.9[173411]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:40:37 compute-1 systemd[1]: Reloading.
Dec 02 23:40:38 compute-1 systemd-rc-local-generator[173434]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:40:38 compute-1 systemd-sysv-generator[173441]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:40:38 compute-1 sudo[173409]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:39 compute-1 python3.9[173595]: ansible-ansible.builtin.service_facts Invoked
Dec 02 23:40:39 compute-1 network[173612]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 23:40:39 compute-1 network[173613]: 'network-scripts' will be removed from distribution in near future.
Dec 02 23:40:39 compute-1 network[173614]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 23:40:48 compute-1 sudo[173886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emmmuzrjqbakfuqgmpeyikowtpulscna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718847.6376448-1416-249730839321590/AnsiballZ_systemd_service.py'
Dec 02 23:40:48 compute-1 sudo[173886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:48 compute-1 python3.9[173888]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:48 compute-1 sudo[173886]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:49 compute-1 sudo[174039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtwtfxmkupkeqwpxxlwpdyoezrpcqluq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718848.8939471-1416-262588084664656/AnsiballZ_systemd_service.py'
Dec 02 23:40:49 compute-1 sudo[174039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:49 compute-1 python3.9[174041]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:49 compute-1 sudo[174039]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:50 compute-1 sudo[174192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nekpnudlrfadzwiasnsobqhkzrpeaioj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718849.8100526-1416-179988623142731/AnsiballZ_systemd_service.py'
Dec 02 23:40:50 compute-1 sudo[174192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:50 compute-1 python3.9[174194]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:50 compute-1 sudo[174192]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:50 compute-1 podman[174196]: 2025-12-02 23:40:50.686281455 +0000 UTC m=+0.107460273 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 02 23:40:51 compute-1 sudo[174366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbyztbvmtbamleygwbnjrwzsdsczjdls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718850.7749295-1416-257134321370159/AnsiballZ_systemd_service.py'
Dec 02 23:40:51 compute-1 sudo[174366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:51 compute-1 python3.9[174368]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:51 compute-1 sudo[174366]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:52 compute-1 sudo[174519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jenxrhampncskvytagoxfkjxoypzykch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718851.6957698-1416-269061379589143/AnsiballZ_systemd_service.py'
Dec 02 23:40:52 compute-1 sudo[174519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:52 compute-1 python3.9[174521]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:52 compute-1 sudo[174519]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:52 compute-1 sudo[174672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtqhjzltmvkwbuwraumirpjtzthtpxwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718852.587007-1416-115792668100952/AnsiballZ_systemd_service.py'
Dec 02 23:40:52 compute-1 sudo[174672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:53 compute-1 python3.9[174674]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:53 compute-1 sudo[174672]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:53 compute-1 sudo[174825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tstxhekngxnatxizqanifpjtylacvqjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718853.5429645-1416-145600010559561/AnsiballZ_systemd_service.py'
Dec 02 23:40:53 compute-1 sudo[174825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:54 compute-1 python3.9[174827]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:54 compute-1 sudo[174825]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:54 compute-1 sudo[174978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfbpymlkkdzosbzfyyipzhzqolbawgev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718854.494087-1416-185534186286675/AnsiballZ_systemd_service.py'
Dec 02 23:40:54 compute-1 sudo[174978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:55 compute-1 python3.9[174980]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:55 compute-1 sudo[174978]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:56 compute-1 sudo[175131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rplaemfshlyzycgzyhibkahodojloqxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718855.8051727-1534-193073936126523/AnsiballZ_file.py'
Dec 02 23:40:56 compute-1 sudo[175131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:56 compute-1 python3.9[175133]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:56 compute-1 sudo[175131]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:57 compute-1 sudo[175283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xavnxsupeovtcsjavfrxucwciuhuamsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718856.6322608-1534-110578374191889/AnsiballZ_file.py'
Dec 02 23:40:57 compute-1 sudo[175283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:57 compute-1 python3.9[175285]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:57 compute-1 sudo[175283]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:57 compute-1 sudo[175435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uathjagbjbqgogwrzuikgjreebfxylxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718857.4197624-1534-57980821916313/AnsiballZ_file.py'
Dec 02 23:40:57 compute-1 sudo[175435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:57 compute-1 python3.9[175437]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:57 compute-1 sudo[175435]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:58 compute-1 sudo[175587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kirtxtghgbqsfcidgtubyjddomqcjilh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718858.0490448-1534-14433374059755/AnsiballZ_file.py'
Dec 02 23:40:58 compute-1 sudo[175587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:58 compute-1 python3.9[175589]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:58 compute-1 sudo[175587]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:59 compute-1 sudo[175739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nehbayyxqbpvyonxeuompmyaruzcdhiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718858.8674867-1534-11555093270221/AnsiballZ_file.py'
Dec 02 23:40:59 compute-1 sudo[175739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:59 compute-1 python3.9[175741]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:59 compute-1 sudo[175739]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:00 compute-1 sudo[175891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmnirgxwyoylpbitukvaobefolitxdpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718859.7241929-1534-151040000104146/AnsiballZ_file.py'
Dec 02 23:41:00 compute-1 sudo[175891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:00 compute-1 python3.9[175893]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:00 compute-1 sudo[175891]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:00 compute-1 sudo[176043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhndruszwmxvjxdpiypkuxocbhdlpftj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718860.5450492-1534-141756554083998/AnsiballZ_file.py'
Dec 02 23:41:00 compute-1 sudo[176043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:01 compute-1 python3.9[176045]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:01 compute-1 sudo[176043]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:41:01.668 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:41:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:41:01.669 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:41:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:41:01.669 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:41:01 compute-1 sudo[176196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkwggwqpixkzyrxvftzqahpqwxpqqizf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718861.4010189-1534-129020767961677/AnsiballZ_file.py'
Dec 02 23:41:01 compute-1 sudo[176196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:01 compute-1 python3.9[176198]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:01 compute-1 sudo[176196]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:02 compute-1 sudo[176357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyhagndqgrqxbnurjylszevlumgcdugd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718862.2195938-1648-133149764525626/AnsiballZ_file.py'
Dec 02 23:41:02 compute-1 sudo[176357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:02 compute-1 podman[176322]: 2025-12-02 23:41:02.657531503 +0000 UTC m=+0.154686036 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Dec 02 23:41:02 compute-1 python3.9[176365]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:02 compute-1 sudo[176357]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:03 compute-1 sudo[176525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdtyuzszgszsdbjsjybmcnqpoeezxnzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718862.9550376-1648-114667807386498/AnsiballZ_file.py'
Dec 02 23:41:03 compute-1 sudo[176525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:03 compute-1 python3.9[176527]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:03 compute-1 sudo[176525]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:04 compute-1 sudo[176677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvjadlsweghtczewksqpuymhjpsougoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718863.7699053-1648-184891045235591/AnsiballZ_file.py'
Dec 02 23:41:04 compute-1 sudo[176677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:04 compute-1 python3.9[176679]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:04 compute-1 sudo[176677]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:05 compute-1 sudo[176829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huyvdxwzsrckcyftigixctiuimnautxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718864.5787835-1648-164645415617683/AnsiballZ_file.py'
Dec 02 23:41:05 compute-1 sudo[176829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:05 compute-1 python3.9[176831]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:05 compute-1 sudo[176829]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:05 compute-1 sudo[176981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzqzsvvgjkobnsxzgxtdloiwvwzrhpix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718865.446826-1648-13749843038874/AnsiballZ_file.py'
Dec 02 23:41:05 compute-1 sudo[176981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:06 compute-1 python3.9[176983]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:06 compute-1 sudo[176981]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:06 compute-1 podman[177107]: 2025-12-02 23:41:06.764548317 +0000 UTC m=+0.077466990 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:41:06 compute-1 sudo[177148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmswejumgjlzkodmdiloqkyxpwnchavr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718866.265899-1648-271846179389653/AnsiballZ_file.py'
Dec 02 23:41:06 compute-1 sudo[177148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:06 compute-1 python3.9[177152]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:07 compute-1 sudo[177148]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:07 compute-1 sudo[177302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkpipcgbxyhrcgehowljcgckrxjbbrut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718867.164239-1648-238739591876152/AnsiballZ_file.py'
Dec 02 23:41:07 compute-1 sudo[177302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:07 compute-1 python3.9[177304]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:07 compute-1 sudo[177302]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:08 compute-1 sudo[177454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fimzclrjaskapxyohljdxxnfnzkjdvku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718867.9192789-1648-393976001165/AnsiballZ_file.py'
Dec 02 23:41:08 compute-1 sudo[177454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:08 compute-1 python3.9[177456]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:08 compute-1 sudo[177454]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:09 compute-1 sudo[177606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnpksqefkhsnnknbybhjiycjyudnxanh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718868.9338868-1764-17414776999274/AnsiballZ_command.py'
Dec 02 23:41:09 compute-1 sudo[177606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:09 compute-1 python3.9[177608]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:09 compute-1 sudo[177606]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:10 compute-1 python3.9[177760]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 23:41:11 compute-1 sudo[177910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-undpqnaaoxlstfjiabqongzmzuxhsppa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718870.978783-1800-51601395729087/AnsiballZ_systemd_service.py'
Dec 02 23:41:11 compute-1 sudo[177910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:11 compute-1 python3.9[177912]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:41:11 compute-1 systemd[1]: Reloading.
Dec 02 23:41:11 compute-1 systemd-rc-local-generator[177940]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:41:11 compute-1 systemd-sysv-generator[177943]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:41:12 compute-1 sudo[177910]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:12 compute-1 sudo[178097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvulxexqqsuqzsvimtvgtrzbvvmxtfqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718872.3489006-1816-159836585986192/AnsiballZ_command.py'
Dec 02 23:41:12 compute-1 sudo[178097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:12 compute-1 python3.9[178099]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:12 compute-1 sudo[178097]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:13 compute-1 sudo[178250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olmurzaqpiqsmawdfsiypevmihwiwkks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718873.1266448-1816-266142796976619/AnsiballZ_command.py'
Dec 02 23:41:13 compute-1 sudo[178250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:13 compute-1 python3.9[178252]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:13 compute-1 sudo[178250]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:14 compute-1 sudo[178403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiqtbtjtuzewyivoqaqznxhgkwyufupb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718873.9882827-1816-142723688262297/AnsiballZ_command.py'
Dec 02 23:41:14 compute-1 sudo[178403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:14 compute-1 python3.9[178405]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:14 compute-1 sudo[178403]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:15 compute-1 sudo[178556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqybqaxlrlpzvqtwugdlaoewqimcndwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718874.8495657-1816-98341270563407/AnsiballZ_command.py'
Dec 02 23:41:15 compute-1 sudo[178556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:15 compute-1 python3.9[178558]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:15 compute-1 sudo[178556]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:16 compute-1 sudo[178709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liogwuryqjyvjhqmegbpfsresxmmosjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718875.6557882-1816-128157290896106/AnsiballZ_command.py'
Dec 02 23:41:16 compute-1 sudo[178709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:16 compute-1 python3.9[178711]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:16 compute-1 sudo[178709]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:16 compute-1 sudo[178862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unogtlvarqdnnwrcaxjhqdlmzxwmzipn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718876.4817975-1816-95756493227694/AnsiballZ_command.py'
Dec 02 23:41:16 compute-1 sudo[178862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:17 compute-1 python3.9[178864]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:17 compute-1 sudo[178862]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:17 compute-1 sudo[179015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsobaivusjvrajsnrhkffospehljtnqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718877.429532-1816-191623800540012/AnsiballZ_command.py'
Dec 02 23:41:17 compute-1 sudo[179015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:17 compute-1 python3.9[179017]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:17 compute-1 sudo[179015]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:18 compute-1 sudo[179168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iprcjwowmtolokxuxnrkbvpisqpfgebi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718878.1396008-1816-172107624786850/AnsiballZ_command.py'
Dec 02 23:41:18 compute-1 sudo[179168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:18 compute-1 python3.9[179170]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:18 compute-1 sudo[179168]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:20 compute-1 sudo[179321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svndwnsaysobgbiskigbmkwvpclcyhoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718879.9903991-1959-274883071429184/AnsiballZ_file.py'
Dec 02 23:41:20 compute-1 sudo[179321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:20 compute-1 python3.9[179323]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:20 compute-1 sudo[179321]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:21 compute-1 podman[179424]: 2025-12-02 23:41:21.271824714 +0000 UTC m=+0.103925094 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:41:21 compute-1 sudo[179494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svbbbkojzdhtpvsikdmexdrhggcctokr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718880.901554-1959-174887484877466/AnsiballZ_file.py'
Dec 02 23:41:21 compute-1 sudo[179494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:21 compute-1 python3.9[179496]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:21 compute-1 sudo[179494]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:22 compute-1 sudo[179646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbknnmntlkpyuaricpgdrmjvflmuxkae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718881.8766148-1959-249069973300430/AnsiballZ_file.py'
Dec 02 23:41:22 compute-1 sudo[179646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:22 compute-1 python3.9[179648]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:22 compute-1 sudo[179646]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:23 compute-1 sudo[179798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdxiqkwtfovxqqixspbywodrgpfumvjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718882.7251284-2003-120707766099895/AnsiballZ_file.py'
Dec 02 23:41:23 compute-1 sudo[179798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:23 compute-1 python3.9[179800]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:23 compute-1 sudo[179798]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:23 compute-1 sudo[179950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jobyqcazqpgjwiwddggjpmmetsmjljir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718883.5310745-2003-187817507542344/AnsiballZ_file.py'
Dec 02 23:41:23 compute-1 sudo[179950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:24 compute-1 python3.9[179952]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:24 compute-1 sudo[179950]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:24 compute-1 sudo[180102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzsegqilizgbfgqhprychiiqzforkvmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718884.397527-2003-40825473449347/AnsiballZ_file.py'
Dec 02 23:41:24 compute-1 sudo[180102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:25 compute-1 python3.9[180104]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:25 compute-1 sudo[180102]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:25 compute-1 sudo[180254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yasgpyqlzvozxopzllyxzletvzccohsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718885.2690835-2003-164482288970972/AnsiballZ_file.py'
Dec 02 23:41:25 compute-1 sudo[180254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:25 compute-1 python3.9[180256]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:25 compute-1 sudo[180254]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:26 compute-1 sudo[180406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoqbyrujjepgaazptnycnrukadneydqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718886.1002314-2003-131644326679187/AnsiballZ_file.py'
Dec 02 23:41:26 compute-1 sudo[180406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:26 compute-1 python3.9[180408]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:26 compute-1 sudo[180406]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:27 compute-1 sudo[180558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixkektjkkyqeakbozfpabxczcnhsdnko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718887.0076275-2003-207228064410475/AnsiballZ_file.py'
Dec 02 23:41:27 compute-1 sudo[180558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:27 compute-1 python3.9[180560]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:27 compute-1 sudo[180558]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:28 compute-1 sudo[180710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvfgcppvmcqmwdmumrxxkctahjxhuscl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718887.7882144-2003-161131101853288/AnsiballZ_file.py'
Dec 02 23:41:28 compute-1 sudo[180710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:28 compute-1 python3.9[180712]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:28 compute-1 sudo[180710]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:33 compute-1 podman[180788]: 2025-12-02 23:41:33.267509072 +0000 UTC m=+0.108127566 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Dec 02 23:41:33 compute-1 sudo[180889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iccixkvlyaesjlsylnkhvwekbwqcprjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718893.0684774-2240-98540901089814/AnsiballZ_getent.py'
Dec 02 23:41:33 compute-1 sudo[180889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:33 compute-1 python3.9[180891]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 02 23:41:33 compute-1 sudo[180889]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:34 compute-1 sudo[181042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsltuplrbroncxwgqtkhcbgrqntrmdje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718893.9785802-2256-140360423296213/AnsiballZ_group.py'
Dec 02 23:41:34 compute-1 sudo[181042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:34 compute-1 python3.9[181044]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 02 23:41:34 compute-1 groupadd[181045]: group added to /etc/group: name=nova, GID=42436
Dec 02 23:41:34 compute-1 groupadd[181045]: group added to /etc/gshadow: name=nova
Dec 02 23:41:34 compute-1 groupadd[181045]: new group: name=nova, GID=42436
Dec 02 23:41:34 compute-1 sudo[181042]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:35 compute-1 sudo[181200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgrugbptcswiqsrvbxfexfyzhcjfaslq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718895.071479-2272-122406860142081/AnsiballZ_user.py'
Dec 02 23:41:35 compute-1 sudo[181200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:35 compute-1 python3.9[181202]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 02 23:41:35 compute-1 useradd[181204]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Dec 02 23:41:35 compute-1 useradd[181204]: add 'nova' to group 'libvirt'
Dec 02 23:41:35 compute-1 useradd[181204]: add 'nova' to shadow group 'libvirt'
Dec 02 23:41:36 compute-1 sudo[181200]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:37 compute-1 sshd-session[181235]: Accepted publickey for zuul from 192.168.122.30 port 53416 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:41:37 compute-1 systemd-logind[790]: New session 25 of user zuul.
Dec 02 23:41:37 compute-1 systemd[1]: Started Session 25 of User zuul.
Dec 02 23:41:37 compute-1 sshd-session[181235]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:41:37 compute-1 podman[181237]: 2025-12-02 23:41:37.204433871 +0000 UTC m=+0.061992069 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 02 23:41:37 compute-1 sshd-session[181251]: Received disconnect from 192.168.122.30 port 53416:11: disconnected by user
Dec 02 23:41:37 compute-1 sshd-session[181251]: Disconnected from user zuul 192.168.122.30 port 53416
Dec 02 23:41:37 compute-1 sshd-session[181235]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:41:37 compute-1 systemd[1]: session-25.scope: Deactivated successfully.
Dec 02 23:41:37 compute-1 systemd-logind[790]: Session 25 logged out. Waiting for processes to exit.
Dec 02 23:41:37 compute-1 systemd-logind[790]: Removed session 25.
Dec 02 23:41:38 compute-1 python3.9[181407]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:38 compute-1 python3.9[181528]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718897.5327399-2322-116148384089891/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:39 compute-1 python3.9[181678]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:40 compute-1 python3.9[181754]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:41 compute-1 python3.9[181904]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:41 compute-1 python3.9[182025]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718900.3787875-2322-193629457766455/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:42 compute-1 python3.9[182175]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:43 compute-1 python3.9[182296]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718901.9069195-2322-244293177477808/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:44 compute-1 python3.9[182446]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:44 compute-1 python3.9[182567]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718903.4718943-2322-95132167644426/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:45 compute-1 python3.9[182717]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:46 compute-1 python3.9[182838]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718904.9723284-2322-139701347918866/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:47 compute-1 sudo[182988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbwmmyefuuovdjqpnxtxhgvyngbmaovq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718906.7468793-2488-140247789111035/AnsiballZ_file.py'
Dec 02 23:41:47 compute-1 sudo[182988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:47 compute-1 python3.9[182990]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:47 compute-1 sudo[182988]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:48 compute-1 sudo[183140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnmdhoivtspahdgcfsflxmsnzvqxkdem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718907.6510804-2504-255870140528057/AnsiballZ_copy.py'
Dec 02 23:41:48 compute-1 sudo[183140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:48 compute-1 python3.9[183142]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:48 compute-1 sudo[183140]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:48 compute-1 sudo[183292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfbbckdfjzawxddeiwfvxjuraqgcjpjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718908.5599935-2520-204324521424361/AnsiballZ_stat.py'
Dec 02 23:41:48 compute-1 sudo[183292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:49 compute-1 python3.9[183294]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:41:49 compute-1 sudo[183292]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:49 compute-1 sudo[183444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ombvvnyykrzonyeffifezssccspnhlxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718909.5917952-2536-210672489705954/AnsiballZ_stat.py'
Dec 02 23:41:49 compute-1 sudo[183444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:50 compute-1 python3.9[183446]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:50 compute-1 sudo[183444]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:50 compute-1 sudo[183567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmgyxwtjyztkqfjpfjalbyblruwvxnoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718909.5917952-2536-210672489705954/AnsiballZ_copy.py'
Dec 02 23:41:50 compute-1 sudo[183567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:50 compute-1 python3.9[183569]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764718909.5917952-2536-210672489705954/.source _original_basename=.ml_k_ti_ follow=False checksum=52910d2c39ea51e396c2b3a04369453a2c48f795 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec 02 23:41:50 compute-1 sudo[183567]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:51 compute-1 podman[183695]: 2025-12-02 23:41:51.73506524 +0000 UTC m=+0.089211842 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4)
Dec 02 23:41:51 compute-1 python3.9[183732]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:41:52 compute-1 python3.9[183891]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:53 compute-1 python3.9[184012]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718912.1878047-2589-193642374570484/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=be63b1bdae8b60cf07c8ce2aab749fcc5ff45b00 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:54 compute-1 python3.9[184162]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:55 compute-1 python3.9[184283]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718913.6878066-2619-5358274004434/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=b86e9600018c7097ad57dbba089fc76217333398 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:55 compute-1 sudo[184433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbfwqdpgxlbtouxhowecoyxoufzjxesg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718915.492063-2652-97078207610194/AnsiballZ_container_config_data.py'
Dec 02 23:41:55 compute-1 sudo[184433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:56 compute-1 python3.9[184435]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 02 23:41:56 compute-1 sudo[184433]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:57 compute-1 sudo[184585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibvxaiidtwvnrniuoqgbguadeswnkhvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718916.5705004-2670-277641885021831/AnsiballZ_container_config_hash.py'
Dec 02 23:41:57 compute-1 sudo[184585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:57 compute-1 python3.9[184587]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 23:41:57 compute-1 sudo[184585]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:58 compute-1 sudo[184737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-miwzvefhpnvvrvrvzpsfohdmejbhpgos ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718917.6356962-2690-148837739863246/AnsiballZ_edpm_container_manage.py'
Dec 02 23:41:58 compute-1 sudo[184737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:58 compute-1 python3[184739]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 23:41:58 compute-1 podman[184775]: 2025-12-02 23:41:58.637447314 +0000 UTC m=+0.075203895 container create 131e7d11e6e4820be66346e295de3f7f3c73fd228ee77852264b768968996ee5 (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, container_name=nova_compute_init, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 23:41:58 compute-1 podman[184775]: 2025-12-02 23:41:58.600489467 +0000 UTC m=+0.038246098 image pull 99c98706e6d475ab9a9b50baf3431e8745aac38f98f776ef6ab7d3c7a2811699 38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Dec 02 23:41:58 compute-1 python3[184739]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z 38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec 02 23:41:58 compute-1 sudo[184737]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:59 compute-1 sudo[184963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waolizpowyqffgozyzgbifwtypcwmmjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718919.1279008-2706-277632750463977/AnsiballZ_stat.py'
Dec 02 23:41:59 compute-1 sudo[184963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:59 compute-1 python3.9[184965]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:41:59 compute-1 sudo[184963]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:00 compute-1 sudo[185117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqmuaywsqbdkwdyramntungsugbcnklh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718920.3861594-2730-120467179486235/AnsiballZ_container_config_data.py'
Dec 02 23:42:00 compute-1 sudo[185117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:01 compute-1 python3.9[185119]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 02 23:42:01 compute-1 sudo[185117]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:42:01.670 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:42:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:42:01.671 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:42:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:42:01.671 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:42:01 compute-1 sudo[185270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbghcitrfpgcvhzuaoynhyupanfelsgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718921.3640063-2748-213464209662784/AnsiballZ_container_config_hash.py'
Dec 02 23:42:01 compute-1 sudo[185270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:02 compute-1 python3.9[185272]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 23:42:02 compute-1 sudo[185270]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:02 compute-1 sudo[185422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjtcdfiarktilmhjtqenkxlflvzxoqut ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718922.4324965-2768-72603690503168/AnsiballZ_edpm_container_manage.py'
Dec 02 23:42:02 compute-1 sudo[185422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:03 compute-1 python3[185424]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 23:42:03 compute-1 podman[185459]: 2025-12-02 23:42:03.292556359 +0000 UTC m=+0.076706862 container create a9a478f6308593ca59cc89adc7b48f8897e9e7faeced9b2b1697c2b489ede6c5 (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 02 23:42:03 compute-1 podman[185459]: 2025-12-02 23:42:03.255027788 +0000 UTC m=+0.039178351 image pull 99c98706e6d475ab9a9b50baf3431e8745aac38f98f776ef6ab7d3c7a2811699 38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Dec 02 23:42:03 compute-1 python3[185424]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro 38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest kolla_start
Dec 02 23:42:03 compute-1 sudo[185422]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:04 compute-1 sudo[185665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbcijzvckjustyfvikjefcxdoppbmagz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718923.7514167-2784-112017621667228/AnsiballZ_stat.py'
Dec 02 23:42:04 compute-1 sudo[185665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:04 compute-1 podman[185621]: 2025-12-02 23:42:04.238704825 +0000 UTC m=+0.141181429 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Dec 02 23:42:04 compute-1 python3.9[185671]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:42:04 compute-1 sudo[185665]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:05 compute-1 sudo[185828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuugjkfdjuxamsmftcycvkkgpvmstdwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718924.7981179-2802-119340502771034/AnsiballZ_file.py'
Dec 02 23:42:05 compute-1 sudo[185828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:05 compute-1 python3.9[185830]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:05 compute-1 sudo[185828]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:06 compute-1 sudo[185979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjlrlkglxqubskorowofcsktjftsyvki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718925.5720906-2802-225270439922313/AnsiballZ_copy.py'
Dec 02 23:42:06 compute-1 sudo[185979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:06 compute-1 python3.9[185981]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764718925.5720906-2802-225270439922313/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:06 compute-1 sudo[185979]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:06 compute-1 sudo[186055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kilvocgrfjjnvnnbkvrhmijphpspapqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718925.5720906-2802-225270439922313/AnsiballZ_systemd.py'
Dec 02 23:42:06 compute-1 sudo[186055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:07 compute-1 python3.9[186057]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:42:07 compute-1 systemd[1]: Reloading.
Dec 02 23:42:07 compute-1 systemd-rc-local-generator[186084]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:42:07 compute-1 systemd-sysv-generator[186087]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:42:07 compute-1 sudo[186055]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:07 compute-1 podman[186093]: 2025-12-02 23:42:07.479811836 +0000 UTC m=+0.092556922 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 23:42:07 compute-1 sudo[186185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-busbkcowafzjwcdjpzbdskaijiufpokp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718925.5720906-2802-225270439922313/AnsiballZ_systemd.py'
Dec 02 23:42:07 compute-1 sudo[186185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:08 compute-1 python3.9[186187]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:42:08 compute-1 systemd[1]: Reloading.
Dec 02 23:42:08 compute-1 systemd-sysv-generator[186220]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:42:08 compute-1 systemd-rc-local-generator[186216]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:42:08 compute-1 systemd[1]: Starting nova_compute container...
Dec 02 23:42:08 compute-1 systemd[1]: Started libcrun container.
Dec 02 23:42:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ba68b976086a9d2c79e5d99df0c664000417ec349e3d14ba9c9b475361ab289/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ba68b976086a9d2c79e5d99df0c664000417ec349e3d14ba9c9b475361ab289/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ba68b976086a9d2c79e5d99df0c664000417ec349e3d14ba9c9b475361ab289/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ba68b976086a9d2c79e5d99df0c664000417ec349e3d14ba9c9b475361ab289/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ba68b976086a9d2c79e5d99df0c664000417ec349e3d14ba9c9b475361ab289/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:08 compute-1 podman[186227]: 2025-12-02 23:42:08.646073776 +0000 UTC m=+0.141056130 container init a9a478f6308593ca59cc89adc7b48f8897e9e7faeced9b2b1697c2b489ede6c5 (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 02 23:42:08 compute-1 podman[186227]: 2025-12-02 23:42:08.658017626 +0000 UTC m=+0.152999960 container start a9a478f6308593ca59cc89adc7b48f8897e9e7faeced9b2b1697c2b489ede6c5 (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Dec 02 23:42:08 compute-1 podman[186227]: nova_compute
Dec 02 23:42:08 compute-1 nova_compute[186243]: + sudo -E kolla_set_configs
Dec 02 23:42:08 compute-1 systemd[1]: Started nova_compute container.
Dec 02 23:42:08 compute-1 sudo[186185]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Validating config file
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Copying service configuration files
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Deleting /etc/ceph
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Creating directory /etc/ceph
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Setting permission for /etc/ceph
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Writing out command to execute
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 02 23:42:08 compute-1 nova_compute[186243]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 02 23:42:08 compute-1 nova_compute[186243]: ++ cat /run_command
Dec 02 23:42:08 compute-1 nova_compute[186243]: + CMD=nova-compute
Dec 02 23:42:08 compute-1 nova_compute[186243]: + ARGS=
Dec 02 23:42:08 compute-1 nova_compute[186243]: + sudo kolla_copy_cacerts
Dec 02 23:42:08 compute-1 nova_compute[186243]: + [[ ! -n '' ]]
Dec 02 23:42:08 compute-1 nova_compute[186243]: + . kolla_extend_start
Dec 02 23:42:08 compute-1 nova_compute[186243]: + echo 'Running command: '\''nova-compute'\'''
Dec 02 23:42:08 compute-1 nova_compute[186243]: Running command: 'nova-compute'
Dec 02 23:42:08 compute-1 nova_compute[186243]: + umask 0022
Dec 02 23:42:08 compute-1 nova_compute[186243]: + exec nova-compute
Dec 02 23:42:09 compute-1 python3.9[186404]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:42:10 compute-1 nova_compute[186243]: 2025-12-02 23:42:10.797 186247 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 02 23:42:10 compute-1 nova_compute[186243]: 2025-12-02 23:42:10.797 186247 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 02 23:42:10 compute-1 nova_compute[186243]: 2025-12-02 23:42:10.797 186247 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 02 23:42:10 compute-1 nova_compute[186243]: 2025-12-02 23:42:10.797 186247 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 02 23:42:10 compute-1 nova_compute[186243]: 2025-12-02 23:42:10.911 186247 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:42:10 compute-1 nova_compute[186243]: 2025-12-02 23:42:10.942 186247 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:42:10 compute-1 nova_compute[186243]: 2025-12-02 23:42:10.943 186247 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Dec 02 23:42:10 compute-1 python3.9[186556]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:42:10 compute-1 nova_compute[186243]: 2025-12-02 23:42:10.972 186247 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Dec 02 23:42:10 compute-1 nova_compute[186243]: 2025-12-02 23:42:10.973 186247 WARNING oslo_config.cfg [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Dec 02 23:42:11 compute-1 python3.9[186707]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:42:11 compute-1 nova_compute[186243]: 2025-12-02 23:42:11.944 186247 INFO nova.virt.driver [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.062 186247 INFO nova.compute.provider_config [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.569 186247 DEBUG oslo_concurrency.lockutils [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.570 186247 DEBUG oslo_concurrency.lockutils [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.570 186247 DEBUG oslo_concurrency.lockutils [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.570 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.570 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.570 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.570 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.571 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.571 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.571 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.571 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.571 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.571 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.572 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.572 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.572 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.572 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.572 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.572 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.572 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.572 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.573 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.573 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.573 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.573 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.573 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.573 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.573 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.573 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.573 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.574 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.574 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.574 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.574 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.574 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.574 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.574 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.574 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.575 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.575 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.575 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.575 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.575 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.575 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.575 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.576 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.576 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.576 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.576 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.576 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.576 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.576 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.576 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.577 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.577 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.577 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.577 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.577 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.577 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.578 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.578 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.578 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.578 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.578 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.578 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.578 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.578 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.579 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.579 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.579 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.579 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.579 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.579 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.579 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.580 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.580 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.580 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.580 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.580 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.580 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.580 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.580 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.580 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.581 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.581 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.581 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] my_shared_fs_storage_ip        = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.581 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.581 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.581 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.581 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.581 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.582 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.582 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.582 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.582 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.582 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.582 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.582 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.582 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.582 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.583 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.583 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.583 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.583 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.583 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.583 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.583 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.583 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.583 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.584 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.584 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.584 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.584 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.584 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.584 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.584 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.585 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.585 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.585 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.585 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.585 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.585 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.585 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.585 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.586 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.586 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.586 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.586 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.586 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.586 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.586 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.586 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.586 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.587 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.587 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.587 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.587 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.587 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.587 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.587 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.587 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.588 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.588 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.588 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.588 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.588 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.588 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.588 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.588 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.589 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.589 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.589 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.589 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.589 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.589 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.589 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.590 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.590 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.590 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.590 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.590 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.590 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.590 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.590 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.590 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.591 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.591 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.591 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.591 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.591 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.591 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.591 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.591 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.592 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.592 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.592 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.592 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.592 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.592 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.592 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.592 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.592 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.593 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.593 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.593 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.593 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.593 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.593 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.593 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.593 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.594 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.594 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.594 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.594 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.594 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.594 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.594 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.594 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.595 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.595 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.595 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.595 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.595 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.595 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.595 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.595 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.596 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.596 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.596 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.596 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.596 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.596 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.596 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.596 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.596 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.597 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.597 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.597 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.597 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.597 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.597 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.597 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.597 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.598 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.598 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.598 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.598 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.598 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.598 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.598 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.599 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.599 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.599 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.599 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.599 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.599 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.600 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.600 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.600 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.600 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.600 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.600 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.600 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.601 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.601 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.601 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.601 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.601 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.601 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.601 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.602 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.602 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.602 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.602 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.602 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.602 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.603 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.603 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.603 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.603 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.603 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.603 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.604 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.604 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.604 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.604 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.604 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.604 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.605 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.605 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.605 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.605 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.605 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.605 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.605 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.606 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.606 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.606 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.606 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.606 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.606 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.606 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.606 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.607 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.607 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.607 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.607 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.607 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.607 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.608 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.608 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.608 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.608 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.608 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.608 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.608 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.608 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.609 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.609 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.609 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.609 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.609 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.609 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.609 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.609 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.610 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.610 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.610 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.610 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.610 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.610 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.610 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.611 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.611 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.611 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.611 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.611 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.611 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.611 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.611 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.612 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.612 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.612 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.612 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.612 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.612 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.612 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.612 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.612 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.613 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.613 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.613 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.613 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.613 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.614 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.614 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.615 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.615 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.615 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.615 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.615 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.615 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.615 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.615 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.616 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.616 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.616 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.616 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.616 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.616 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.616 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.616 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.617 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.617 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.617 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.617 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.617 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.617 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.618 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.618 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.618 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.618 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.618 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.618 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.619 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.619 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.619 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.619 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.619 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.619 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.619 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.620 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.620 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.620 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.620 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.620 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.620 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.621 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.621 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.621 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.621 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.621 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.622 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.622 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.622 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.623 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.623 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.623 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.624 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.624 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.624 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.624 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.624 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.624 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.625 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.625 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.625 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.625 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.625 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.625 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.626 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.626 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.626 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.626 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.626 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.626 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.626 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.626 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.627 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.627 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.627 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.627 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.627 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.627 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.627 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.627 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.628 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.628 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.628 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.628 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.628 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.628 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.628 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.628 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.629 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.629 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.629 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.629 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.629 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.629 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.629 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.629 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.630 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.630 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.630 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.630 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.630 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.630 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.630 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.631 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.631 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.631 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.631 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.631 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.631 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.631 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.632 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.632 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.632 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.632 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.632 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.632 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.632 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.633 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.633 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.633 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.633 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.633 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.633 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.633 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.634 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.634 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.634 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.634 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.634 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.634 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.634 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.634 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.634 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.635 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.635 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.635 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.635 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.635 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.635 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.635 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.635 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.636 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.636 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.636 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.636 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.636 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.636 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.636 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.636 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.637 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.637 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.637 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.637 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.637 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.637 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.637 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.637 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.637 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.638 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.638 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.638 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.638 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.638 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.638 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.638 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.638 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.639 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.639 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.639 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.639 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.639 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.639 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.639 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.639 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.639 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.640 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.640 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.640 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.640 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.640 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.640 186247 WARNING oslo_config.cfg [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 02 23:42:12 compute-1 nova_compute[186243]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 02 23:42:12 compute-1 nova_compute[186243]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 02 23:42:12 compute-1 nova_compute[186243]: and ``live_migration_inbound_addr`` respectively.
Dec 02 23:42:12 compute-1 nova_compute[186243]: ).  Its value may be silently ignored in the future.
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.641 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.641 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.641 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.641 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.641 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.migration_inbound_addr = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.641 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.642 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.642 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.642 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.642 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.642 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.642 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.642 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.642 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.642 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.643 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.643 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.643 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.643 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.643 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.643 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.643 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.643 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.644 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.644 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.644 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.644 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.644 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.644 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.644 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.645 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.645 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.645 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.645 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.645 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.645 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.645 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.646 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.646 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.646 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.646 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.646 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.646 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.646 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.647 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.647 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.647 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.647 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.647 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.647 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.647 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.648 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.648 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.648 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.648 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.648 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.648 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.648 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.649 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.649 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.649 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.649 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.649 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.649 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.649 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.649 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.650 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.650 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.650 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.650 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.650 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.650 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.650 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.650 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.650 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.651 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.651 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.651 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.651 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.651 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.651 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.651 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.652 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.652 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.652 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.652 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.652 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.652 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.652 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.652 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.653 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.653 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.653 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.653 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.653 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.653 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.653 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.654 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.654 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.654 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.654 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.654 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.654 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.654 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.655 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.655 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.655 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.655 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.655 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.655 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.655 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.656 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.656 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.656 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.656 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.656 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.656 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.656 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.657 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.657 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.657 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.657 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.657 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.657 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.657 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.657 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.658 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.658 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.658 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.658 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.658 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.658 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.658 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.658 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.658 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.659 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.659 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.659 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.659 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.659 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.659 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.659 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.659 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.660 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.660 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.660 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.660 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.660 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.660 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.660 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.660 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.660 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.661 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.661 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.661 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.661 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.661 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.661 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.661 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.661 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.662 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.662 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.662 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.662 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.662 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.662 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.662 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.662 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.662 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.663 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.663 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.663 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.663 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.663 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.663 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.663 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.663 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.664 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.664 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.664 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.664 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.664 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.664 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.664 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.664 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.665 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.665 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.665 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.665 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.665 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.665 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.665 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.665 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.666 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.666 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.666 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.666 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.666 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.666 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.666 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.666 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.667 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.667 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.667 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.667 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.667 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.667 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.667 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.668 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.668 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.668 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.668 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.668 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.668 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.668 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.668 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.669 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.669 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.669 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.669 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.669 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.669 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.669 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.669 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.669 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.670 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.670 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.670 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.670 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.670 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.670 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.670 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.670 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.670 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.671 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.671 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.671 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.671 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.671 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.671 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.671 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.671 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.671 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.672 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.672 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.672 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.672 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.672 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.672 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.672 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.672 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.672 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.673 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.673 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.673 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.673 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.673 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.673 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.673 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.674 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.674 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.674 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.674 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.674 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.674 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.674 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.674 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.674 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.675 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.675 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.675 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.675 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.675 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.675 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.675 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.675 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.676 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.676 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.676 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.676 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.676 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.676 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.676 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.676 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.677 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.677 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.677 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.677 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.677 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.677 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.677 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.677 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.678 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.678 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.678 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.678 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.678 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.678 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.678 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.678 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.679 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.679 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.679 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.679 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.679 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.hostname = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.679 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.679 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.679 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.680 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.680 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.680 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.680 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.680 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.680 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.680 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.680 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.680 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.681 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.681 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.681 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.681 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.681 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.681 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.681 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.681 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.682 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.682 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.682 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.682 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.682 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.682 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.682 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.682 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.682 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.683 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.683 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.683 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.683 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.683 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.683 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.683 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.683 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.684 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.684 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.684 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.684 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.684 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.684 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.684 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.684 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.684 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.685 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.685 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.685 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.685 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.685 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.685 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.685 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.685 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.686 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.686 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.686 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.686 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.686 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.686 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.686 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.686 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.686 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.687 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.687 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.687 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.687 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.687 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.687 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.687 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.687 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.687 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.688 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.688 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.688 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.688 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.688 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.688 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.688 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.688 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.689 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.689 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.689 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.689 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.689 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.689 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.689 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.689 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.689 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.690 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.690 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.690 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.690 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.690 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.690 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.690 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.690 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.691 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.691 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.691 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.691 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.691 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.691 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.691 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.691 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.691 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.692 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.692 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.692 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.692 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.692 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.692 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.692 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.692 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.692 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.693 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.693 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.693 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.693 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.693 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.693 186247 DEBUG oslo_service.backend._eventlet.service [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Dec 02 23:42:12 compute-1 nova_compute[186243]: 2025-12-02 23:42:12.694 186247 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251105112212.710ffbb.el10)
Dec 02 23:42:12 compute-1 sudo[186859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ishhrporzjeigicahalerzgnkfesrhri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718932.377208-2922-49068737219190/AnsiballZ_podman_container.py'
Dec 02 23:42:12 compute-1 sudo[186859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:13 compute-1 nova_compute[186243]: 2025-12-02 23:42:13.199 186247 DEBUG nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Dec 02 23:42:13 compute-1 python3.9[186861]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 02 23:42:13 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Dec 02 23:42:13 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 23:42:13 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 23:42:13 compute-1 systemd[1]: Started libvirt QEMU daemon.
Dec 02 23:42:13 compute-1 nova_compute[186243]: 2025-12-02 23:42:13.288 186247 DEBUG nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fbfa0dc5400> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Dec 02 23:42:13 compute-1 nova_compute[186243]: libvirt:  error : internal error: could not initialize domain event timer
Dec 02 23:42:13 compute-1 nova_compute[186243]: 2025-12-02 23:42:13.290 186247 WARNING nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Dec 02 23:42:13 compute-1 nova_compute[186243]: 2025-12-02 23:42:13.290 186247 DEBUG nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fbfa0dc5400> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Dec 02 23:42:13 compute-1 nova_compute[186243]: 2025-12-02 23:42:13.293 186247 DEBUG nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Dec 02 23:42:13 compute-1 nova_compute[186243]: 2025-12-02 23:42:13.294 186247 DEBUG nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Dec 02 23:42:13 compute-1 nova_compute[186243]: 2025-12-02 23:42:13.294 186247 INFO nova.utils [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] The default thread pool MainProcess.default is initialized
Dec 02 23:42:13 compute-1 nova_compute[186243]: 2025-12-02 23:42:13.295 186247 DEBUG nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Dec 02 23:42:13 compute-1 nova_compute[186243]: 2025-12-02 23:42:13.295 186247 INFO nova.virt.libvirt.driver [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Connection event '1' reason 'None'
Dec 02 23:42:13 compute-1 sudo[186859]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:13 compute-1 nova_compute[186243]: 2025-12-02 23:42:13.803 186247 WARNING nova.virt.libvirt.driver [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Dec 02 23:42:13 compute-1 nova_compute[186243]: 2025-12-02 23:42:13.803 186247 DEBUG nova.virt.libvirt.volume.mount [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 02 23:42:14 compute-1 sudo[187093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nutrzrdhlwqjtrozyywknncicwqopzdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718933.629524-2938-193565197092757/AnsiballZ_systemd.py'
Dec 02 23:42:14 compute-1 sudo[187093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:14 compute-1 nova_compute[186243]: 2025-12-02 23:42:14.318 186247 INFO nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Libvirt host capabilities <capabilities>
Dec 02 23:42:14 compute-1 nova_compute[186243]: 
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <host>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <uuid>8b5693ff-2e25-45a5-bebe-492dc3141f79</uuid>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <cpu>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <arch>x86_64</arch>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model>EPYC-Rome-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <vendor>AMD</vendor>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <microcode version='16777317'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <signature family='23' model='49' stepping='0'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='x2apic'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='tsc-deadline'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='osxsave'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='hypervisor'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='tsc_adjust'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='spec-ctrl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='stibp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='arch-capabilities'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='ssbd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='cmp_legacy'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='topoext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='virt-ssbd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='lbrv'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='tsc-scale'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='vmcb-clean'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='pause-filter'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='pfthreshold'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='svme-addr-chk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='rdctl-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='skip-l1dfl-vmentry'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='mds-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature name='pschange-mc-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <pages unit='KiB' size='4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <pages unit='KiB' size='2048'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <pages unit='KiB' size='1048576'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </cpu>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <power_management>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <suspend_mem/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <suspend_disk/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <suspend_hybrid/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </power_management>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <iommu support='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <migration_features>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <live/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <uri_transports>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <uri_transport>tcp</uri_transport>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <uri_transport>rdma</uri_transport>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </uri_transports>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </migration_features>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <topology>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <cells num='1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <cell id='0'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:           <memory unit='KiB'>7864312</memory>
Dec 02 23:42:14 compute-1 nova_compute[186243]:           <pages unit='KiB' size='4'>1966078</pages>
Dec 02 23:42:14 compute-1 nova_compute[186243]:           <pages unit='KiB' size='2048'>0</pages>
Dec 02 23:42:14 compute-1 nova_compute[186243]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 02 23:42:14 compute-1 nova_compute[186243]:           <distances>
Dec 02 23:42:14 compute-1 nova_compute[186243]:             <sibling id='0' value='10'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:           </distances>
Dec 02 23:42:14 compute-1 nova_compute[186243]:           <cpus num='8'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:           </cpus>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         </cell>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </cells>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </topology>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <cache>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </cache>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <secmodel>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model>selinux</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <doi>0</doi>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </secmodel>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <secmodel>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model>dac</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <doi>0</doi>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </secmodel>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </host>
Dec 02 23:42:14 compute-1 nova_compute[186243]: 
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <guest>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <os_type>hvm</os_type>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <arch name='i686'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <wordsize>32</wordsize>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <domain type='qemu'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <domain type='kvm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </arch>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <features>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <pae/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <nonpae/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <acpi default='on' toggle='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <apic default='on' toggle='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <cpuselection/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <deviceboot/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <disksnapshot default='on' toggle='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <externalSnapshot/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </features>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </guest>
Dec 02 23:42:14 compute-1 nova_compute[186243]: 
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <guest>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <os_type>hvm</os_type>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <arch name='x86_64'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <wordsize>64</wordsize>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <domain type='qemu'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <domain type='kvm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </arch>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <features>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <acpi default='on' toggle='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <apic default='on' toggle='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <cpuselection/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <deviceboot/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <disksnapshot default='on' toggle='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <externalSnapshot/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </features>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </guest>
Dec 02 23:42:14 compute-1 nova_compute[186243]: 
Dec 02 23:42:14 compute-1 nova_compute[186243]: </capabilities>
Dec 02 23:42:14 compute-1 nova_compute[186243]: 
Dec 02 23:42:14 compute-1 nova_compute[186243]: 2025-12-02 23:42:14.327 186247 DEBUG nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Dec 02 23:42:14 compute-1 nova_compute[186243]: 2025-12-02 23:42:14.359 186247 DEBUG nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 02 23:42:14 compute-1 nova_compute[186243]: <domainCapabilities>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <domain>kvm</domain>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <arch>i686</arch>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <vcpu max='240'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <iothreads supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <os supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <enum name='firmware'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <loader supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>rom</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pflash</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='readonly'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>yes</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>no</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='secure'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>no</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </loader>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </os>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <cpu>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <mode name='host-passthrough' supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='hostPassthroughMigratable'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>on</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>off</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </mode>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <mode name='maximum' supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='maximumMigratable'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>on</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>off</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </mode>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <mode name='host-model' supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <vendor>AMD</vendor>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='x2apic'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='hypervisor'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='stibp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='ssbd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='overflow-recov'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='succor'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='ibrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='lbrv'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='tsc-scale'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='flushbyasid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='pause-filter'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='pfthreshold'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='disable' name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </mode>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <mode name='custom' supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-noTSX'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cooperlake'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cooperlake-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cooperlake-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Denverton'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Denverton-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Denverton-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Denverton-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Dhyana-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Genoa'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='auto-ibrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='auto-ibrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Milan'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Milan-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Milan-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Rome'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Rome-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Rome-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Rome-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='GraniteRapids'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='GraniteRapids-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='GraniteRapids-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx10'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx10-128'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx10-256'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx10-512'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-noTSX'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 python3.9[187095]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v5'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v6'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v7'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='IvyBridge'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='IvyBridge-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='IvyBridge-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='IvyBridge-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='KnightsMill'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512er'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512pf'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='KnightsMill-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512er'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512pf'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Opteron_G4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xop'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Opteron_G4-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xop'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Opteron_G5'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tbm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xop'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Opteron_G5-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tbm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xop'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SapphireRapids'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SapphireRapids-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SapphireRapids-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SapphireRapids-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SierraForest'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cmpccxadd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SierraForest-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cmpccxadd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v5'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='athlon'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='athlon-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='core2duo'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='core2duo-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='coreduo'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='coreduo-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='n270'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='n270-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='phenom'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='phenom-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </mode>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </cpu>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <memoryBacking supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <enum name='sourceType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>file</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>anonymous</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>memfd</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </memoryBacking>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <devices>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <disk supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='diskDevice'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>disk</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>cdrom</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>floppy</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>lun</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='bus'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>ide</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>fdc</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>scsi</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>usb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>sata</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio-transitional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio-non-transitional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </disk>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <graphics supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vnc</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>egl-headless</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>dbus</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </graphics>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <video supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='modelType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vga</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>cirrus</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>none</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>bochs</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>ramfb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </video>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <hostdev supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='mode'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>subsystem</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='startupPolicy'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>default</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>mandatory</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>requisite</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>optional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='subsysType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>usb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pci</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>scsi</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='capsType'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='pciBackend'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </hostdev>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <rng supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio-transitional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio-non-transitional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>random</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>egd</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>builtin</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </rng>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <filesystem supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='driverType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>path</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>handle</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtiofs</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </filesystem>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <tpm supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tpm-tis</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tpm-crb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>emulator</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>external</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendVersion'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>2.0</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </tpm>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <redirdev supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='bus'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>usb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </redirdev>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <channel supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pty</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>unix</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </channel>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <crypto supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>qemu</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>builtin</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </crypto>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <interface supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>default</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>passt</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </interface>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <panic supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>isa</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>hyperv</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </panic>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <console supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>null</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vc</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pty</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>dev</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>file</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pipe</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>stdio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>udp</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tcp</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>unix</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>qemu-vdagent</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>dbus</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </console>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </devices>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <features>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <gic supported='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <vmcoreinfo supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <genid supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <backingStoreInput supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <backup supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <async-teardown supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <ps2 supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <sev supported='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <sgx supported='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <hyperv supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='features'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>relaxed</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vapic</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>spinlocks</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vpindex</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>runtime</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>synic</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>stimer</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>reset</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vendor_id</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>frequencies</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>reenlightenment</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tlbflush</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>ipi</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>avic</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>emsr_bitmap</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>xmm_input</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <defaults>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <spinlocks>4095</spinlocks>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <stimer_direct>on</stimer_direct>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <tlbflush_direct>on</tlbflush_direct>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <tlbflush_extended>on</tlbflush_extended>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </defaults>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </hyperv>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <launchSecurity supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='sectype'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tdx</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </launchSecurity>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </features>
Dec 02 23:42:14 compute-1 nova_compute[186243]: </domainCapabilities>
Dec 02 23:42:14 compute-1 nova_compute[186243]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 02 23:42:14 compute-1 nova_compute[186243]: 2025-12-02 23:42:14.370 186247 DEBUG nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 02 23:42:14 compute-1 nova_compute[186243]: <domainCapabilities>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <domain>kvm</domain>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <arch>i686</arch>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <vcpu max='4096'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <iothreads supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <os supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <enum name='firmware'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <loader supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>rom</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pflash</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='readonly'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>yes</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>no</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='secure'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>no</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </loader>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </os>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <cpu>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <mode name='host-passthrough' supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='hostPassthroughMigratable'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>on</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>off</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </mode>
Dec 02 23:42:14 compute-1 systemd[1]: Stopping nova_compute container...
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <mode name='maximum' supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='maximumMigratable'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>on</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>off</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </mode>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <mode name='host-model' supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <vendor>AMD</vendor>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='x2apic'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='hypervisor'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='stibp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='ssbd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='overflow-recov'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='succor'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='ibrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='lbrv'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='tsc-scale'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='flushbyasid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='pause-filter'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='pfthreshold'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='disable' name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </mode>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <mode name='custom' supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-noTSX'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cooperlake'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cooperlake-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cooperlake-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Denverton'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Denverton-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Denverton-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Denverton-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Dhyana-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Genoa'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='auto-ibrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='auto-ibrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Milan'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Milan-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Milan-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Rome'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Rome-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Rome-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Rome-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='GraniteRapids'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='GraniteRapids-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='GraniteRapids-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx10'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx10-128'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx10-256'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx10-512'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-noTSX'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v5'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v6'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v7'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='IvyBridge'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='IvyBridge-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='IvyBridge-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='IvyBridge-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='KnightsMill'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512er'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512pf'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='KnightsMill-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512er'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512pf'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Opteron_G4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xop'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Opteron_G4-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xop'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Opteron_G5'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tbm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xop'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Opteron_G5-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tbm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xop'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SapphireRapids'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SapphireRapids-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SapphireRapids-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SapphireRapids-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SierraForest'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cmpccxadd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SierraForest-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cmpccxadd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v5'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='athlon'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='athlon-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='core2duo'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='core2duo-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='coreduo'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='coreduo-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='n270'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='n270-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='phenom'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='phenom-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </mode>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </cpu>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <memoryBacking supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <enum name='sourceType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>file</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>anonymous</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>memfd</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </memoryBacking>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <devices>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <disk supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='diskDevice'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>disk</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>cdrom</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>floppy</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>lun</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='bus'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>fdc</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>scsi</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>usb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>sata</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio-transitional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio-non-transitional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </disk>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <graphics supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vnc</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>egl-headless</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>dbus</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </graphics>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <video supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='modelType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vga</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>cirrus</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>none</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>bochs</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>ramfb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </video>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <hostdev supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='mode'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>subsystem</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='startupPolicy'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>default</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>mandatory</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>requisite</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>optional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='subsysType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>usb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pci</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>scsi</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='capsType'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='pciBackend'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </hostdev>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <rng supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio-transitional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio-non-transitional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>random</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>egd</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>builtin</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </rng>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <filesystem supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='driverType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>path</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>handle</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtiofs</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </filesystem>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <tpm supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tpm-tis</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tpm-crb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>emulator</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>external</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendVersion'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>2.0</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </tpm>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <redirdev supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='bus'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>usb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </redirdev>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <channel supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pty</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>unix</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </channel>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <crypto supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>qemu</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>builtin</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </crypto>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <interface supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>default</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>passt</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </interface>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <panic supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>isa</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>hyperv</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </panic>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <console supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>null</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vc</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pty</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>dev</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>file</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pipe</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>stdio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>udp</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tcp</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>unix</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>qemu-vdagent</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>dbus</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </console>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </devices>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <features>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <gic supported='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <vmcoreinfo supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <genid supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <backingStoreInput supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <backup supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <async-teardown supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <ps2 supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <sev supported='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <sgx supported='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <hyperv supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='features'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>relaxed</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vapic</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>spinlocks</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vpindex</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>runtime</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>synic</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>stimer</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>reset</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vendor_id</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>frequencies</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>reenlightenment</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tlbflush</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>ipi</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>avic</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>emsr_bitmap</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>xmm_input</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <defaults>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <spinlocks>4095</spinlocks>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <stimer_direct>on</stimer_direct>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <tlbflush_direct>on</tlbflush_direct>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <tlbflush_extended>on</tlbflush_extended>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </defaults>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </hyperv>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <launchSecurity supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='sectype'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tdx</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </launchSecurity>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </features>
Dec 02 23:42:14 compute-1 nova_compute[186243]: </domainCapabilities>
Dec 02 23:42:14 compute-1 nova_compute[186243]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 02 23:42:14 compute-1 nova_compute[186243]: 2025-12-02 23:42:14.424 186247 DEBUG nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Dec 02 23:42:14 compute-1 nova_compute[186243]: 2025-12-02 23:42:14.429 186247 DEBUG nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 02 23:42:14 compute-1 nova_compute[186243]: <domainCapabilities>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <domain>kvm</domain>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <arch>x86_64</arch>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <vcpu max='240'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <iothreads supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <os supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <enum name='firmware'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <loader supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>rom</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pflash</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='readonly'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>yes</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>no</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='secure'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>no</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </loader>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </os>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <cpu>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <mode name='host-passthrough' supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='hostPassthroughMigratable'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>on</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>off</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </mode>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <mode name='maximum' supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='maximumMigratable'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>on</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>off</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </mode>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <mode name='host-model' supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <vendor>AMD</vendor>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='x2apic'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='hypervisor'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='stibp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='ssbd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='overflow-recov'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='succor'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='ibrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='lbrv'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='tsc-scale'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='flushbyasid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='pause-filter'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='pfthreshold'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='disable' name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </mode>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <mode name='custom' supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-noTSX'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cooperlake'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cooperlake-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cooperlake-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Denverton'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Denverton-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Denverton-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Denverton-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Dhyana-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Genoa'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='auto-ibrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='auto-ibrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Milan'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Milan-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Milan-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Rome'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Rome-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Rome-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Rome-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='GraniteRapids'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='GraniteRapids-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='GraniteRapids-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx10'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx10-128'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx10-256'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx10-512'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-noTSX'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v5'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v6'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v7'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='IvyBridge'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='IvyBridge-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='IvyBridge-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='IvyBridge-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='KnightsMill'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512er'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512pf'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='KnightsMill-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512er'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512pf'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Opteron_G4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xop'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Opteron_G4-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xop'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Opteron_G5'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tbm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xop'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Opteron_G5-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tbm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xop'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SapphireRapids'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SapphireRapids-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SapphireRapids-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SapphireRapids-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SierraForest'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cmpccxadd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SierraForest-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cmpccxadd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v5'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='athlon'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='athlon-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='core2duo'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='core2duo-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='coreduo'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='coreduo-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='n270'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='n270-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='phenom'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='phenom-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </mode>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </cpu>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <memoryBacking supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <enum name='sourceType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>file</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>anonymous</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>memfd</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </memoryBacking>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <devices>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <disk supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='diskDevice'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>disk</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>cdrom</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>floppy</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>lun</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='bus'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>ide</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>fdc</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>scsi</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>usb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>sata</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio-transitional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio-non-transitional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </disk>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <graphics supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vnc</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>egl-headless</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>dbus</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </graphics>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <video supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='modelType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vga</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>cirrus</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>none</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>bochs</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>ramfb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </video>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <hostdev supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='mode'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>subsystem</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='startupPolicy'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>default</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>mandatory</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>requisite</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>optional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='subsysType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>usb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pci</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>scsi</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='capsType'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='pciBackend'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </hostdev>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <rng supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio-transitional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio-non-transitional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>random</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>egd</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>builtin</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </rng>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <filesystem supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='driverType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>path</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>handle</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtiofs</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </filesystem>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <tpm supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tpm-tis</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tpm-crb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>emulator</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>external</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendVersion'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>2.0</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </tpm>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <redirdev supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='bus'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>usb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </redirdev>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <channel supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pty</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>unix</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </channel>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <crypto supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>qemu</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>builtin</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </crypto>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <interface supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>default</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>passt</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </interface>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <panic supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>isa</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>hyperv</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </panic>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <console supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>null</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vc</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pty</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>dev</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>file</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pipe</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>stdio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>udp</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tcp</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>unix</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>qemu-vdagent</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>dbus</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </console>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </devices>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <features>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <gic supported='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <vmcoreinfo supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <genid supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <backingStoreInput supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <backup supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <async-teardown supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <ps2 supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <sev supported='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <sgx supported='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <hyperv supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='features'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>relaxed</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vapic</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>spinlocks</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vpindex</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>runtime</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>synic</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>stimer</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>reset</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vendor_id</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>frequencies</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>reenlightenment</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tlbflush</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>ipi</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>avic</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>emsr_bitmap</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>xmm_input</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <defaults>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <spinlocks>4095</spinlocks>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <stimer_direct>on</stimer_direct>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <tlbflush_direct>on</tlbflush_direct>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <tlbflush_extended>on</tlbflush_extended>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </defaults>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </hyperv>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <launchSecurity supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='sectype'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tdx</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </launchSecurity>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </features>
Dec 02 23:42:14 compute-1 nova_compute[186243]: </domainCapabilities>
Dec 02 23:42:14 compute-1 nova_compute[186243]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 02 23:42:14 compute-1 nova_compute[186243]: 2025-12-02 23:42:14.496 186247 DEBUG nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 02 23:42:14 compute-1 nova_compute[186243]: <domainCapabilities>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <domain>kvm</domain>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <arch>x86_64</arch>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <vcpu max='4096'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <iothreads supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <os supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <enum name='firmware'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>efi</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <loader supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>rom</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pflash</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='readonly'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>yes</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>no</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='secure'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>yes</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>no</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </loader>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </os>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <cpu>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <mode name='host-passthrough' supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='hostPassthroughMigratable'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>on</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>off</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </mode>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <mode name='maximum' supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='maximumMigratable'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>on</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>off</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </mode>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <mode name='host-model' supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <vendor>AMD</vendor>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='x2apic'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='hypervisor'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='stibp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='ssbd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='overflow-recov'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='succor'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='ibrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='lbrv'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='tsc-scale'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='flushbyasid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='pause-filter'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='pfthreshold'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <feature policy='disable' name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </mode>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <mode name='custom' supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-noTSX'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Broadwell-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cooperlake'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cooperlake-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Cooperlake-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Denverton'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Denverton-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Denverton-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Denverton-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Dhyana-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Genoa'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='auto-ibrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='auto-ibrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Milan'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Milan-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Milan-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Rome'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Rome-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Rome-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-Rome-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='EPYC-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='GraniteRapids'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='GraniteRapids-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='GraniteRapids-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx10'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx10-128'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx10-256'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx10-512'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-noTSX'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Haswell-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v5'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v6'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Icelake-Server-v7'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='IvyBridge'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='IvyBridge-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='IvyBridge-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='IvyBridge-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='KnightsMill'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512er'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512pf'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='KnightsMill-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512er'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512pf'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Opteron_G4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xop'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Opteron_G4-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xop'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Opteron_G5'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tbm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xop'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Opteron_G5-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tbm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xop'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SapphireRapids'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SapphireRapids-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SapphireRapids-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SapphireRapids-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='la57'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SierraForest'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cmpccxadd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='SierraForest-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-ifma'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cmpccxadd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Client-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='hle'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Skylake-Server-v5'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='pku'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge-v2'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge-v3'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='Snowridge-v4'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='erms'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='athlon'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='athlon-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='core2duo'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='core2duo-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='coreduo'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='coreduo-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='n270'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='n270-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='ss'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='phenom'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <blockers model='phenom-v1'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </blockers>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </mode>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </cpu>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <memoryBacking supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <enum name='sourceType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>file</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>anonymous</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <value>memfd</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </memoryBacking>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <devices>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <disk supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='diskDevice'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>disk</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>cdrom</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>floppy</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>lun</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='bus'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>fdc</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>scsi</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>usb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>sata</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio-transitional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio-non-transitional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </disk>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <graphics supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vnc</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>egl-headless</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>dbus</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </graphics>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <video supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='modelType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vga</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>cirrus</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>none</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>bochs</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>ramfb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </video>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <hostdev supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='mode'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>subsystem</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='startupPolicy'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>default</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>mandatory</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>requisite</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>optional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='subsysType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>usb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pci</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>scsi</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='capsType'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='pciBackend'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </hostdev>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <rng supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio-transitional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtio-non-transitional</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>random</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>egd</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>builtin</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </rng>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <filesystem supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='driverType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>path</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>handle</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>virtiofs</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </filesystem>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <tpm supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tpm-tis</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tpm-crb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>emulator</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>external</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendVersion'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>2.0</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </tpm>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <redirdev supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='bus'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>usb</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </redirdev>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <channel supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pty</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>unix</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </channel>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <crypto supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>qemu</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>builtin</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </crypto>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <interface supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='backendType'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>default</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>passt</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </interface>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <panic supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='model'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>isa</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>hyperv</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </panic>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <console supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='type'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>null</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vc</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pty</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>dev</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>file</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>pipe</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>stdio</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>udp</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tcp</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>unix</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>qemu-vdagent</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>dbus</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </console>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </devices>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   <features>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <gic supported='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <vmcoreinfo supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <genid supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <backingStoreInput supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <backup supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <async-teardown supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <ps2 supported='yes'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <sev supported='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <sgx supported='no'/>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <hyperv supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='features'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>relaxed</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vapic</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>spinlocks</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vpindex</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>runtime</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>synic</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>stimer</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>reset</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>vendor_id</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>frequencies</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>reenlightenment</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tlbflush</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>ipi</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>avic</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>emsr_bitmap</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>xmm_input</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <defaults>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <spinlocks>4095</spinlocks>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <stimer_direct>on</stimer_direct>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <tlbflush_direct>on</tlbflush_direct>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <tlbflush_extended>on</tlbflush_extended>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </defaults>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </hyperv>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     <launchSecurity supported='yes'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       <enum name='sectype'>
Dec 02 23:42:14 compute-1 nova_compute[186243]:         <value>tdx</value>
Dec 02 23:42:14 compute-1 nova_compute[186243]:       </enum>
Dec 02 23:42:14 compute-1 nova_compute[186243]:     </launchSecurity>
Dec 02 23:42:14 compute-1 nova_compute[186243]:   </features>
Dec 02 23:42:14 compute-1 nova_compute[186243]: </domainCapabilities>
Dec 02 23:42:14 compute-1 nova_compute[186243]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 02 23:42:14 compute-1 nova_compute[186243]: 2025-12-02 23:42:14.596 186247 DEBUG nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Dec 02 23:42:14 compute-1 nova_compute[186243]: 2025-12-02 23:42:14.596 186247 DEBUG nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Dec 02 23:42:14 compute-1 nova_compute[186243]: 2025-12-02 23:42:14.596 186247 DEBUG nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Dec 02 23:42:14 compute-1 nova_compute[186243]: 2025-12-02 23:42:14.597 186247 INFO nova.virt.libvirt.host [None req-ceabdb07-6020-41ed-b8ea-5269f0600d23 - - - - - -] Secure Boot support detected
Dec 02 23:42:14 compute-1 nova_compute[186243]: 2025-12-02 23:42:14.597 186247 DEBUG oslo_concurrency.lockutils [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:42:14 compute-1 nova_compute[186243]: 2025-12-02 23:42:14.598 186247 DEBUG oslo_concurrency.lockutils [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:42:14 compute-1 nova_compute[186243]: 2025-12-02 23:42:14.598 186247 DEBUG oslo_concurrency.lockutils [None req-847b874b-91ab-4972-92d9-b990125a50ee - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:42:15 compute-1 virtqemud[186882]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 02 23:42:15 compute-1 virtqemud[186882]: hostname: compute-1
Dec 02 23:42:15 compute-1 virtqemud[186882]: End of file while reading data: Input/output error
Dec 02 23:42:15 compute-1 systemd[1]: libpod-a9a478f6308593ca59cc89adc7b48f8897e9e7faeced9b2b1697c2b489ede6c5.scope: Deactivated successfully.
Dec 02 23:42:15 compute-1 systemd[1]: libpod-a9a478f6308593ca59cc89adc7b48f8897e9e7faeced9b2b1697c2b489ede6c5.scope: Consumed 3.241s CPU time.
Dec 02 23:42:15 compute-1 podman[187103]: 2025-12-02 23:42:15.093491622 +0000 UTC m=+0.627651145 container died a9a478f6308593ca59cc89adc7b48f8897e9e7faeced9b2b1697c2b489ede6c5 (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 02 23:42:15 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a9a478f6308593ca59cc89adc7b48f8897e9e7faeced9b2b1697c2b489ede6c5-userdata-shm.mount: Deactivated successfully.
Dec 02 23:42:15 compute-1 systemd[1]: var-lib-containers-storage-overlay-0ba68b976086a9d2c79e5d99df0c664000417ec349e3d14ba9c9b475361ab289-merged.mount: Deactivated successfully.
Dec 02 23:42:15 compute-1 podman[187103]: 2025-12-02 23:42:15.162296816 +0000 UTC m=+0.696456339 container cleanup a9a478f6308593ca59cc89adc7b48f8897e9e7faeced9b2b1697c2b489ede6c5 (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute)
Dec 02 23:42:15 compute-1 podman[187103]: nova_compute
Dec 02 23:42:15 compute-1 podman[187130]: nova_compute
Dec 02 23:42:15 compute-1 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 02 23:42:15 compute-1 systemd[1]: Stopped nova_compute container.
Dec 02 23:42:15 compute-1 systemd[1]: Starting nova_compute container...
Dec 02 23:42:15 compute-1 systemd[1]: Started libcrun container.
Dec 02 23:42:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ba68b976086a9d2c79e5d99df0c664000417ec349e3d14ba9c9b475361ab289/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ba68b976086a9d2c79e5d99df0c664000417ec349e3d14ba9c9b475361ab289/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ba68b976086a9d2c79e5d99df0c664000417ec349e3d14ba9c9b475361ab289/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ba68b976086a9d2c79e5d99df0c664000417ec349e3d14ba9c9b475361ab289/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ba68b976086a9d2c79e5d99df0c664000417ec349e3d14ba9c9b475361ab289/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:15 compute-1 podman[187143]: 2025-12-02 23:42:15.39702268 +0000 UTC m=+0.135184356 container init a9a478f6308593ca59cc89adc7b48f8897e9e7faeced9b2b1697c2b489ede6c5 (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.build-date=20251202, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 02 23:42:15 compute-1 podman[187143]: 2025-12-02 23:42:15.40979636 +0000 UTC m=+0.147958026 container start a9a478f6308593ca59cc89adc7b48f8897e9e7faeced9b2b1697c2b489ede6c5 (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 23:42:15 compute-1 podman[187143]: nova_compute
Dec 02 23:42:15 compute-1 nova_compute[187157]: + sudo -E kolla_set_configs
Dec 02 23:42:15 compute-1 systemd[1]: Started nova_compute container.
Dec 02 23:42:15 compute-1 sudo[187093]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Validating config file
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Copying service configuration files
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Deleting /etc/ceph
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Creating directory /etc/ceph
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Setting permission for /etc/ceph
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Writing out command to execute
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 02 23:42:15 compute-1 nova_compute[187157]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 02 23:42:15 compute-1 nova_compute[187157]: ++ cat /run_command
Dec 02 23:42:15 compute-1 nova_compute[187157]: + CMD=nova-compute
Dec 02 23:42:15 compute-1 nova_compute[187157]: + ARGS=
Dec 02 23:42:15 compute-1 nova_compute[187157]: + sudo kolla_copy_cacerts
Dec 02 23:42:15 compute-1 nova_compute[187157]: + [[ ! -n '' ]]
Dec 02 23:42:15 compute-1 nova_compute[187157]: + . kolla_extend_start
Dec 02 23:42:15 compute-1 nova_compute[187157]: Running command: 'nova-compute'
Dec 02 23:42:15 compute-1 nova_compute[187157]: + echo 'Running command: '\''nova-compute'\'''
Dec 02 23:42:15 compute-1 nova_compute[187157]: + umask 0022
Dec 02 23:42:15 compute-1 nova_compute[187157]: + exec nova-compute
Dec 02 23:42:16 compute-1 sudo[187318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkinlwuyswysymytiutdguohvqevydin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718935.8092296-2956-102116168921163/AnsiballZ_podman_container.py'
Dec 02 23:42:16 compute-1 sudo[187318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:16 compute-1 python3.9[187320]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 02 23:42:16 compute-1 systemd[1]: Started libpod-conmon-131e7d11e6e4820be66346e295de3f7f3c73fd228ee77852264b768968996ee5.scope.
Dec 02 23:42:16 compute-1 systemd[1]: Started libcrun container.
Dec 02 23:42:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05641c509511540fe3f45b2a52e58abe36c70c85e39a701a6b1b7a77e1e45f09/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05641c509511540fe3f45b2a52e58abe36c70c85e39a701a6b1b7a77e1e45f09/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05641c509511540fe3f45b2a52e58abe36c70c85e39a701a6b1b7a77e1e45f09/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:16 compute-1 podman[187345]: 2025-12-02 23:42:16.807653934 +0000 UTC m=+0.148891880 container init 131e7d11e6e4820be66346e295de3f7f3c73fd228ee77852264b768968996ee5 (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:42:16 compute-1 podman[187345]: 2025-12-02 23:42:16.814615313 +0000 UTC m=+0.155853189 container start 131e7d11e6e4820be66346e295de3f7f3c73fd228ee77852264b768968996ee5 (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, tcib_build_tag=watcher_latest, config_id=edpm, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 02 23:42:16 compute-1 python3.9[187320]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 02 23:42:16 compute-1 nova_compute_init[187367]: INFO:nova_statedir:Applying nova statedir ownership
Dec 02 23:42:16 compute-1 nova_compute_init[187367]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 02 23:42:16 compute-1 nova_compute_init[187367]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 02 23:42:16 compute-1 nova_compute_init[187367]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 02 23:42:16 compute-1 nova_compute_init[187367]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 02 23:42:16 compute-1 nova_compute_init[187367]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 02 23:42:16 compute-1 nova_compute_init[187367]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 02 23:42:16 compute-1 nova_compute_init[187367]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 02 23:42:16 compute-1 nova_compute_init[187367]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 02 23:42:16 compute-1 nova_compute_init[187367]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 02 23:42:16 compute-1 nova_compute_init[187367]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 02 23:42:16 compute-1 nova_compute_init[187367]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 02 23:42:16 compute-1 nova_compute_init[187367]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 02 23:42:16 compute-1 nova_compute_init[187367]: INFO:nova_statedir:Nova statedir ownership complete
Dec 02 23:42:16 compute-1 systemd[1]: libpod-131e7d11e6e4820be66346e295de3f7f3c73fd228ee77852264b768968996ee5.scope: Deactivated successfully.
Dec 02 23:42:16 compute-1 podman[187382]: 2025-12-02 23:42:16.918626231 +0000 UTC m=+0.021853713 container died 131e7d11e6e4820be66346e295de3f7f3c73fd228ee77852264b768968996ee5 (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute_init, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Dec 02 23:42:16 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-131e7d11e6e4820be66346e295de3f7f3c73fd228ee77852264b768968996ee5-userdata-shm.mount: Deactivated successfully.
Dec 02 23:42:16 compute-1 systemd[1]: var-lib-containers-storage-overlay-05641c509511540fe3f45b2a52e58abe36c70c85e39a701a6b1b7a77e1e45f09-merged.mount: Deactivated successfully.
Dec 02 23:42:16 compute-1 podman[187382]: 2025-12-02 23:42:16.950581628 +0000 UTC m=+0.053809100 container cleanup 131e7d11e6e4820be66346e295de3f7f3c73fd228ee77852264b768968996ee5 (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 02 23:42:16 compute-1 sudo[187318]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:16 compute-1 systemd[1]: libpod-conmon-131e7d11e6e4820be66346e295de3f7f3c73fd228ee77852264b768968996ee5.scope: Deactivated successfully.
Dec 02 23:42:17 compute-1 sshd-session[159002]: Connection closed by 192.168.122.30 port 46660
Dec 02 23:42:17 compute-1 sshd-session[158999]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:42:17 compute-1 systemd[1]: session-24.scope: Deactivated successfully.
Dec 02 23:42:17 compute-1 systemd[1]: session-24.scope: Consumed 2min 23.180s CPU time.
Dec 02 23:42:17 compute-1 systemd-logind[790]: Session 24 logged out. Waiting for processes to exit.
Dec 02 23:42:17 compute-1 systemd-logind[790]: Removed session 24.
Dec 02 23:42:17 compute-1 nova_compute[187157]: 2025-12-02 23:42:17.534 187161 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 02 23:42:17 compute-1 nova_compute[187157]: 2025-12-02 23:42:17.534 187161 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 02 23:42:17 compute-1 nova_compute[187157]: 2025-12-02 23:42:17.534 187161 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 02 23:42:17 compute-1 nova_compute[187157]: 2025-12-02 23:42:17.534 187161 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 02 23:42:17 compute-1 nova_compute[187157]: 2025-12-02 23:42:17.650 187161 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:42:17 compute-1 nova_compute[187157]: 2025-12-02 23:42:17.668 187161 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:42:17 compute-1 nova_compute[187157]: 2025-12-02 23:42:17.669 187161 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Dec 02 23:42:17 compute-1 nova_compute[187157]: 2025-12-02 23:42:17.699 187161 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Dec 02 23:42:17 compute-1 nova_compute[187157]: 2025-12-02 23:42:17.700 187161 WARNING oslo_config.cfg [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Dec 02 23:42:18 compute-1 nova_compute[187157]: 2025-12-02 23:42:18.753 187161 INFO nova.virt.driver [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 02 23:42:18 compute-1 nova_compute[187157]: 2025-12-02 23:42:18.868 187161 INFO nova.compute.provider_config [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.376 187161 DEBUG oslo_concurrency.lockutils [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.376 187161 DEBUG oslo_concurrency.lockutils [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.377 187161 DEBUG oslo_concurrency.lockutils [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.377 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.378 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.378 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.378 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.379 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.379 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.379 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.379 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.380 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.380 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.380 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.381 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.381 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.381 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.381 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.382 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.382 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.382 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.382 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.383 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.383 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.383 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.383 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.384 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.384 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.384 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.385 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.385 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.385 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.385 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.386 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.386 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.386 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.386 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.387 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.387 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.387 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.387 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.388 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.388 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.388 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.389 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.389 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.389 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.389 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.390 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.390 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.390 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.390 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.391 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.391 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.391 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.392 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.392 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.392 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.392 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.393 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.393 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.393 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.393 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.394 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.394 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.394 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.394 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.395 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.395 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.395 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.395 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.396 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.396 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.396 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.396 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.397 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.397 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.397 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.397 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.398 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.398 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.398 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.398 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.399 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.399 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.399 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.399 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.400 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.400 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.400 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.401 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.401 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.401 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.401 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.402 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.402 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.402 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.402 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.403 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.403 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.403 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.403 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.404 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.404 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.404 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.404 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.405 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.405 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.405 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.405 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.406 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.406 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.406 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.406 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.407 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.407 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.407 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.407 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.408 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.408 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.408 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.408 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.409 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.409 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.409 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.409 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.410 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.410 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.410 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.411 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.411 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.411 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.411 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.411 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.411 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.411 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.412 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.412 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.412 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.412 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.412 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.412 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.413 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.413 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.413 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.413 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.413 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.413 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.414 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.414 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.414 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.414 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.414 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.414 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.415 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.415 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.415 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.415 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.415 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.415 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.415 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.415 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.416 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.416 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.416 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.416 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.416 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.416 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.416 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.416 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.416 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.417 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.417 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.417 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.417 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.417 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.417 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.417 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.417 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.418 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.418 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.418 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.418 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.418 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.418 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.419 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.419 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.419 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.419 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.419 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.420 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.420 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.420 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.420 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.420 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.420 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.420 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.421 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.421 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.421 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.421 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.421 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.421 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.422 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.422 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.422 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.422 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.422 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.422 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.423 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.423 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.423 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.423 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.423 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.423 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.423 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.424 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.424 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.424 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.424 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.424 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.424 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.425 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.425 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.425 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.425 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.425 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.425 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.426 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.426 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.426 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.426 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.426 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.426 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.426 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.427 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.427 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.427 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.427 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.427 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.427 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.428 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.428 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.428 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.428 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.428 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.428 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.429 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.429 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.429 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.429 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.429 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.431 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.431 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.431 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.431 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.431 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.431 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.432 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.432 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.432 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.432 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.432 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.432 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.433 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.433 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.433 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.433 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.433 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.433 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.434 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.434 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.434 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.434 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.434 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.434 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.435 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.435 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.435 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.435 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.435 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.435 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.436 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.436 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.436 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.436 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.436 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.436 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.437 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.437 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.437 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.437 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.437 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.437 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.437 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.438 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.438 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.438 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.438 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.438 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.438 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.439 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.439 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.439 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.439 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.439 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.439 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.440 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.440 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.440 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.440 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.440 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.440 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.440 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.441 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.441 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.441 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.441 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.441 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.441 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.442 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.442 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.442 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.442 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.442 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.442 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.443 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.443 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.443 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.443 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.443 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.443 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.443 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.444 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.444 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.444 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.444 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.444 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.444 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.445 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.445 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.445 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.445 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.445 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.445 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.446 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.446 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.446 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.446 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.446 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.446 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.447 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.447 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.447 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.447 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.447 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.447 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.447 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.448 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.448 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.448 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.448 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.448 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.448 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.449 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.449 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.449 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.449 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.449 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.449 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.450 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.450 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.450 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.450 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.450 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.450 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.450 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.451 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.451 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.451 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.451 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.451 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.452 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.452 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.452 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.452 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.452 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.453 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.453 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.453 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.453 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.453 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.453 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.453 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.454 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.454 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.454 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.454 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.454 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.454 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.455 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.455 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.455 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.455 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.455 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.455 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.455 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.456 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.456 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.456 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.456 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.456 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.456 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.457 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.457 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.457 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.457 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.457 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.457 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.457 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.457 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.458 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.458 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.458 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.458 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.458 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.458 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.458 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.458 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.458 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.459 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.459 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.459 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.459 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.459 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.459 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.459 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.459 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.459 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.460 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.460 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.460 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.460 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.460 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.460 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.460 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.460 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.461 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.461 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.461 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.461 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.461 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.461 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.461 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.461 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.461 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.462 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.462 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.462 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.462 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.462 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.462 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.462 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.462 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.462 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.463 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.463 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.463 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.463 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.463 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.463 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.463 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.463 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.463 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.464 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.464 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.464 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.464 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.464 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.464 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.464 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.465 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.465 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.465 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.465 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.465 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.465 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.465 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.465 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.466 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.466 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.466 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.466 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.466 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.466 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.466 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.466 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.466 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.467 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.467 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.467 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.467 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.467 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.467 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.467 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.467 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.467 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.468 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.468 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.468 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.468 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.468 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.468 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.468 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.468 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.469 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.469 187161 WARNING oslo_config.cfg [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 02 23:42:19 compute-1 nova_compute[187157]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 02 23:42:19 compute-1 nova_compute[187157]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 02 23:42:19 compute-1 nova_compute[187157]: and ``live_migration_inbound_addr`` respectively.
Dec 02 23:42:19 compute-1 nova_compute[187157]: ).  Its value may be silently ignored in the future.
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.469 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.469 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.469 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.469 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.469 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.470 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.470 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.470 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.470 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.470 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.470 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.470 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.470 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.470 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.471 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.471 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.471 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.471 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.471 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.471 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.471 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.471 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.472 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.472 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.472 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.472 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.472 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.472 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.472 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.472 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.473 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.473 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.473 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.473 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.473 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.473 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.473 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.473 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.473 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.474 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.474 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.474 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.474 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.474 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.474 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.474 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.474 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.474 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.475 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.475 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.475 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.475 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.475 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.475 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.475 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.475 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.476 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.476 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.476 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.476 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.476 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.476 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.476 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.476 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.476 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.477 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.477 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.477 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.477 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.477 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.477 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.477 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.477 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.477 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.478 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.478 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.478 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.478 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.478 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.478 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.478 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.478 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.479 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.479 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.479 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.479 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.479 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.479 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.479 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.479 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.480 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.480 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.480 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.480 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.480 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.480 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.480 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.480 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.480 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.481 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.481 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.481 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.481 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.481 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.481 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.481 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.481 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.481 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.482 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.482 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.482 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.482 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.482 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.482 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.482 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.482 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.482 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.483 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.483 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.483 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.483 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.483 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.483 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.483 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.483 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.484 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.484 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.484 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.484 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.484 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.484 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.484 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.484 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.485 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.485 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.485 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.485 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.485 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.485 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.485 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.486 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.486 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.486 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.486 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.486 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.486 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.486 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.486 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.487 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.487 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.487 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.487 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.487 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.487 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.487 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.487 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.487 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.488 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.488 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.488 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.488 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.488 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.488 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.488 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.488 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.489 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.489 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.489 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.489 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.489 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.489 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.489 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.489 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.489 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.490 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.490 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.490 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.490 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.490 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.490 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.490 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.490 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.490 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.491 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.491 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.491 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.491 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.491 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.491 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.491 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.491 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.492 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.492 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.492 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.492 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.492 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.492 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.492 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.493 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.493 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.493 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.493 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.493 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.493 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.493 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.493 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.494 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.494 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.494 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.494 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.494 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.494 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.494 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.494 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.494 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.495 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.495 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.495 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.495 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.495 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.495 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.495 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.495 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.495 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.496 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.496 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.496 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.496 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.496 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.496 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.496 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.496 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.497 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.497 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.497 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.497 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.497 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.497 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.497 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.497 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.497 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.497 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.498 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.498 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.498 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.498 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.498 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.498 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.498 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.498 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.498 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.499 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.499 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.499 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.499 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.499 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.499 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.499 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.499 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.500 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.500 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.500 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.500 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.500 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.500 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.500 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.501 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.501 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.501 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.501 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.501 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.501 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.501 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.501 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.501 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.502 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.502 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.502 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.502 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.502 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.502 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.502 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.502 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.502 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.503 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.503 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.503 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.503 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.503 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.503 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.503 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.503 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.504 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.504 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.504 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.504 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.504 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.504 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.504 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.504 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.504 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.505 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.505 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.505 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.505 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.hostname = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.505 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.505 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.505 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.505 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.505 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.506 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.506 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.506 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.506 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.506 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.506 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.506 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.506 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.506 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.507 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.507 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.507 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.507 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.507 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.507 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.507 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.507 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.507 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.508 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.508 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.508 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.508 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.508 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.508 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.508 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.508 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.509 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.509 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.509 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.509 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.509 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.509 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.509 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.509 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.510 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.510 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.510 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.510 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.510 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.510 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.510 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.510 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.510 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.511 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.511 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.511 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.511 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.511 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.511 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.511 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.511 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.511 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.511 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.512 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.512 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.512 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.512 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.512 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.512 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.512 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.512 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.512 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.513 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.513 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.513 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.513 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.513 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.513 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.513 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.513 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.513 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.514 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.514 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.514 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.514 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.514 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.514 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.514 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.514 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.514 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.515 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.515 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.515 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.515 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.515 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.515 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.515 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.515 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.516 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.516 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.516 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.516 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.516 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.516 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.516 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.516 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.517 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.517 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.517 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.517 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.517 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.517 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.517 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.517 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.518 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.518 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.518 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.518 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.518 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.518 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.518 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.518 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.518 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.519 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.519 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.519 187161 DEBUG oslo_service.backend._eventlet.service [None req-507aebf2-c897-447d-804d-ef619af5dc49 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Dec 02 23:42:19 compute-1 nova_compute[187157]: 2025-12-02 23:42:19.519 187161 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251105112212.710ffbb.el10)
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.026 187161 DEBUG nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.041 187161 DEBUG nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fd31b263f80> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Dec 02 23:42:20 compute-1 nova_compute[187157]: libvirt:  error : internal error: could not initialize domain event timer
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.042 187161 WARNING nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.043 187161 DEBUG nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fd31b263f80> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.045 187161 DEBUG nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.045 187161 DEBUG nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.045 187161 INFO nova.utils [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] The default thread pool MainProcess.default is initialized
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.046 187161 DEBUG nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.046 187161 INFO nova.virt.libvirt.driver [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Connection event '1' reason 'None'
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.059 187161 INFO nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Libvirt host capabilities <capabilities>
Dec 02 23:42:20 compute-1 nova_compute[187157]: 
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <host>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <uuid>8b5693ff-2e25-45a5-bebe-492dc3141f79</uuid>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <cpu>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <arch>x86_64</arch>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model>EPYC-Rome-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <vendor>AMD</vendor>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <microcode version='16777317'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <signature family='23' model='49' stepping='0'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='x2apic'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='tsc-deadline'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='osxsave'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='hypervisor'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='tsc_adjust'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='spec-ctrl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='stibp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='arch-capabilities'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='ssbd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='cmp_legacy'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='topoext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='virt-ssbd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='lbrv'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='tsc-scale'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='vmcb-clean'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='pause-filter'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='pfthreshold'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='svme-addr-chk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='rdctl-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='skip-l1dfl-vmentry'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='mds-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature name='pschange-mc-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <pages unit='KiB' size='4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <pages unit='KiB' size='2048'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <pages unit='KiB' size='1048576'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </cpu>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <power_management>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <suspend_mem/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <suspend_disk/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <suspend_hybrid/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </power_management>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <iommu support='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <migration_features>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <live/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <uri_transports>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <uri_transport>tcp</uri_transport>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <uri_transport>rdma</uri_transport>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </uri_transports>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </migration_features>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <topology>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <cells num='1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <cell id='0'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:           <memory unit='KiB'>7864312</memory>
Dec 02 23:42:20 compute-1 nova_compute[187157]:           <pages unit='KiB' size='4'>1966078</pages>
Dec 02 23:42:20 compute-1 nova_compute[187157]:           <pages unit='KiB' size='2048'>0</pages>
Dec 02 23:42:20 compute-1 nova_compute[187157]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 02 23:42:20 compute-1 nova_compute[187157]:           <distances>
Dec 02 23:42:20 compute-1 nova_compute[187157]:             <sibling id='0' value='10'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:           </distances>
Dec 02 23:42:20 compute-1 nova_compute[187157]:           <cpus num='8'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:           </cpus>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         </cell>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </cells>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </topology>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <cache>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </cache>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <secmodel>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model>selinux</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <doi>0</doi>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </secmodel>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <secmodel>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model>dac</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <doi>0</doi>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </secmodel>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </host>
Dec 02 23:42:20 compute-1 nova_compute[187157]: 
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <guest>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <os_type>hvm</os_type>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <arch name='i686'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <wordsize>32</wordsize>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <domain type='qemu'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <domain type='kvm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </arch>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <features>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <pae/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <nonpae/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <acpi default='on' toggle='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <apic default='on' toggle='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <cpuselection/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <deviceboot/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <disksnapshot default='on' toggle='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <externalSnapshot/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </features>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </guest>
Dec 02 23:42:20 compute-1 nova_compute[187157]: 
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <guest>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <os_type>hvm</os_type>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <arch name='x86_64'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <wordsize>64</wordsize>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <domain type='qemu'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <domain type='kvm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </arch>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <features>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <acpi default='on' toggle='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <apic default='on' toggle='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <cpuselection/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <deviceboot/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <disksnapshot default='on' toggle='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <externalSnapshot/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </features>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </guest>
Dec 02 23:42:20 compute-1 nova_compute[187157]: 
Dec 02 23:42:20 compute-1 nova_compute[187157]: </capabilities>
Dec 02 23:42:20 compute-1 nova_compute[187157]: 
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.067 187161 DEBUG nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.072 187161 DEBUG nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 02 23:42:20 compute-1 nova_compute[187157]: <domainCapabilities>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <domain>kvm</domain>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <arch>i686</arch>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <vcpu max='240'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <iothreads supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <os supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <enum name='firmware'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <loader supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>rom</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pflash</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='readonly'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>yes</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>no</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='secure'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>no</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </loader>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </os>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <cpu>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <mode name='host-passthrough' supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='hostPassthroughMigratable'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>on</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>off</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </mode>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <mode name='maximum' supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='maximumMigratable'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>on</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>off</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </mode>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <mode name='host-model' supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <vendor>AMD</vendor>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='x2apic'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='hypervisor'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='stibp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='ssbd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='overflow-recov'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='succor'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='ibrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='lbrv'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='tsc-scale'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='flushbyasid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='pause-filter'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='pfthreshold'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='disable' name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </mode>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <mode name='custom' supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-noTSX'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cooperlake'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cooperlake-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cooperlake-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Denverton'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Denverton-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Denverton-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Denverton-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Dhyana-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Genoa'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='auto-ibrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='auto-ibrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Milan'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Milan-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Milan-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Rome'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Rome-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Rome-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Rome-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='GraniteRapids'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='GraniteRapids-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='GraniteRapids-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx10'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx10-128'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx10-256'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx10-512'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-noTSX'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v5'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v6'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v7'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='IvyBridge'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='IvyBridge-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='IvyBridge-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='IvyBridge-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='KnightsMill'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512er'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512pf'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='KnightsMill-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512er'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512pf'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Opteron_G4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xop'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Opteron_G4-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xop'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Opteron_G5'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tbm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xop'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Opteron_G5-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tbm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xop'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SapphireRapids'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SapphireRapids-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SapphireRapids-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SapphireRapids-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SierraForest'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cmpccxadd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SierraForest-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cmpccxadd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v5'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='athlon'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='athlon-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='core2duo'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='core2duo-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='coreduo'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='coreduo-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='n270'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='n270-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='phenom'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='phenom-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </mode>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </cpu>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <memoryBacking supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <enum name='sourceType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>file</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>anonymous</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>memfd</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </memoryBacking>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <devices>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <disk supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='diskDevice'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>disk</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>cdrom</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>floppy</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>lun</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='bus'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>ide</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>fdc</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>scsi</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>usb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>sata</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio-transitional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio-non-transitional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </disk>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <graphics supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vnc</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>egl-headless</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>dbus</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </graphics>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <video supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='modelType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vga</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>cirrus</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>none</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>bochs</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>ramfb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </video>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <hostdev supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='mode'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>subsystem</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='startupPolicy'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>default</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>mandatory</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>requisite</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>optional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='subsysType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>usb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pci</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>scsi</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='capsType'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='pciBackend'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </hostdev>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <rng supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio-transitional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio-non-transitional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>random</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>egd</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>builtin</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </rng>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <filesystem supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='driverType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>path</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>handle</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtiofs</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </filesystem>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <tpm supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tpm-tis</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tpm-crb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>emulator</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>external</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendVersion'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>2.0</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </tpm>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <redirdev supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='bus'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>usb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </redirdev>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <channel supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pty</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>unix</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </channel>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <crypto supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>qemu</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>builtin</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </crypto>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <interface supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>default</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>passt</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </interface>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <panic supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>isa</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>hyperv</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </panic>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <console supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>null</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vc</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pty</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>dev</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>file</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pipe</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>stdio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>udp</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tcp</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>unix</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>qemu-vdagent</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>dbus</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </console>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </devices>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <features>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <gic supported='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <vmcoreinfo supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <genid supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <backingStoreInput supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <backup supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <async-teardown supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <ps2 supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <sev supported='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <sgx supported='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <hyperv supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='features'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>relaxed</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vapic</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>spinlocks</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vpindex</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>runtime</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>synic</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>stimer</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>reset</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vendor_id</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>frequencies</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>reenlightenment</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tlbflush</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>ipi</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>avic</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>emsr_bitmap</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>xmm_input</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <defaults>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <spinlocks>4095</spinlocks>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <stimer_direct>on</stimer_direct>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <tlbflush_direct>on</tlbflush_direct>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <tlbflush_extended>on</tlbflush_extended>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </defaults>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </hyperv>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <launchSecurity supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='sectype'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tdx</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </launchSecurity>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </features>
Dec 02 23:42:20 compute-1 nova_compute[187157]: </domainCapabilities>
Dec 02 23:42:20 compute-1 nova_compute[187157]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.080 187161 DEBUG nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 02 23:42:20 compute-1 nova_compute[187157]: <domainCapabilities>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <domain>kvm</domain>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <arch>i686</arch>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <vcpu max='4096'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <iothreads supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <os supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <enum name='firmware'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <loader supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>rom</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pflash</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='readonly'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>yes</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>no</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='secure'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>no</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </loader>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </os>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <cpu>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <mode name='host-passthrough' supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='hostPassthroughMigratable'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>on</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>off</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </mode>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <mode name='maximum' supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='maximumMigratable'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>on</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>off</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </mode>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <mode name='host-model' supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <vendor>AMD</vendor>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='x2apic'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='hypervisor'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='stibp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='ssbd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='overflow-recov'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='succor'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='ibrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='lbrv'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='tsc-scale'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='flushbyasid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='pause-filter'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='pfthreshold'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='disable' name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </mode>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <mode name='custom' supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-noTSX'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cooperlake'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cooperlake-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cooperlake-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Denverton'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Denverton-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Denverton-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Denverton-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Dhyana-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Genoa'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='auto-ibrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='auto-ibrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Milan'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Milan-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Milan-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Rome'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Rome-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Rome-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Rome-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='GraniteRapids'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='GraniteRapids-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='GraniteRapids-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx10'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx10-128'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx10-256'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx10-512'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-noTSX'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v5'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v6'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v7'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='IvyBridge'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='IvyBridge-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='IvyBridge-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='IvyBridge-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='KnightsMill'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512er'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512pf'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='KnightsMill-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512er'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512pf'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Opteron_G4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xop'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Opteron_G4-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xop'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Opteron_G5'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tbm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xop'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Opteron_G5-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tbm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xop'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SapphireRapids'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SapphireRapids-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SapphireRapids-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SapphireRapids-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SierraForest'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cmpccxadd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SierraForest-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cmpccxadd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v5'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='athlon'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='athlon-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='core2duo'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='core2duo-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='coreduo'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='coreduo-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='n270'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='n270-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='phenom'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='phenom-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </mode>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </cpu>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <memoryBacking supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <enum name='sourceType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>file</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>anonymous</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>memfd</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </memoryBacking>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <devices>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <disk supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='diskDevice'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>disk</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>cdrom</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>floppy</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>lun</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='bus'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>fdc</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>scsi</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>usb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>sata</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio-transitional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio-non-transitional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </disk>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <graphics supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vnc</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>egl-headless</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>dbus</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </graphics>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <video supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='modelType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vga</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>cirrus</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>none</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>bochs</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>ramfb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </video>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <hostdev supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='mode'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>subsystem</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='startupPolicy'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>default</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>mandatory</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>requisite</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>optional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='subsysType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>usb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pci</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>scsi</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='capsType'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='pciBackend'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </hostdev>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <rng supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio-transitional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio-non-transitional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>random</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>egd</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>builtin</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </rng>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <filesystem supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='driverType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>path</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>handle</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtiofs</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </filesystem>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <tpm supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tpm-tis</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tpm-crb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>emulator</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>external</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendVersion'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>2.0</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </tpm>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <redirdev supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='bus'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>usb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </redirdev>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <channel supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pty</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>unix</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </channel>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <crypto supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>qemu</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>builtin</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </crypto>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <interface supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>default</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>passt</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </interface>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <panic supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>isa</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>hyperv</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </panic>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <console supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>null</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vc</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pty</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>dev</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>file</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pipe</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>stdio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>udp</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tcp</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>unix</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>qemu-vdagent</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>dbus</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </console>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </devices>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <features>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <gic supported='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <vmcoreinfo supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <genid supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <backingStoreInput supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <backup supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <async-teardown supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <ps2 supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <sev supported='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <sgx supported='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <hyperv supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='features'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>relaxed</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vapic</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>spinlocks</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vpindex</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>runtime</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>synic</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>stimer</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>reset</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vendor_id</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>frequencies</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>reenlightenment</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tlbflush</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>ipi</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>avic</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>emsr_bitmap</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>xmm_input</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <defaults>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <spinlocks>4095</spinlocks>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <stimer_direct>on</stimer_direct>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <tlbflush_direct>on</tlbflush_direct>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <tlbflush_extended>on</tlbflush_extended>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </defaults>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </hyperv>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <launchSecurity supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='sectype'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tdx</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </launchSecurity>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </features>
Dec 02 23:42:20 compute-1 nova_compute[187157]: </domainCapabilities>
Dec 02 23:42:20 compute-1 nova_compute[187157]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.135 187161 DEBUG nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.141 187161 DEBUG nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 02 23:42:20 compute-1 nova_compute[187157]: <domainCapabilities>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <domain>kvm</domain>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <arch>x86_64</arch>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <vcpu max='240'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <iothreads supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <os supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <enum name='firmware'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <loader supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>rom</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pflash</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='readonly'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>yes</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>no</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='secure'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>no</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </loader>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </os>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <cpu>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <mode name='host-passthrough' supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='hostPassthroughMigratable'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>on</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>off</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </mode>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <mode name='maximum' supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='maximumMigratable'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>on</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>off</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </mode>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <mode name='host-model' supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <vendor>AMD</vendor>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='x2apic'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='hypervisor'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='stibp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='ssbd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='overflow-recov'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='succor'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='ibrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='lbrv'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='tsc-scale'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='flushbyasid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='pause-filter'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='pfthreshold'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='disable' name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </mode>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <mode name='custom' supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-noTSX'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cooperlake'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cooperlake-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cooperlake-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Denverton'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Denverton-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Denverton-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Denverton-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Dhyana-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Genoa'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='auto-ibrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='auto-ibrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Milan'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Milan-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Milan-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Rome'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Rome-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Rome-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Rome-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='GraniteRapids'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='GraniteRapids-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='GraniteRapids-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx10'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx10-128'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx10-256'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx10-512'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-noTSX'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v5'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v6'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v7'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='IvyBridge'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='IvyBridge-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='IvyBridge-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='IvyBridge-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='KnightsMill'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512er'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512pf'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='KnightsMill-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512er'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512pf'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Opteron_G4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xop'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Opteron_G4-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xop'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Opteron_G5'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tbm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xop'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Opteron_G5-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tbm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xop'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SapphireRapids'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SapphireRapids-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SapphireRapids-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SapphireRapids-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SierraForest'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cmpccxadd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SierraForest-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cmpccxadd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v5'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='athlon'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='athlon-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='core2duo'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='core2duo-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='coreduo'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='coreduo-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='n270'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='n270-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='phenom'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='phenom-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </mode>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </cpu>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <memoryBacking supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <enum name='sourceType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>file</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>anonymous</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>memfd</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </memoryBacking>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <devices>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <disk supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='diskDevice'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>disk</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>cdrom</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>floppy</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>lun</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='bus'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>ide</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>fdc</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>scsi</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>usb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>sata</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio-transitional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio-non-transitional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </disk>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <graphics supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vnc</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>egl-headless</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>dbus</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </graphics>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <video supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='modelType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vga</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>cirrus</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>none</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>bochs</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>ramfb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </video>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <hostdev supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='mode'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>subsystem</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='startupPolicy'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>default</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>mandatory</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>requisite</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>optional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='subsysType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>usb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pci</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>scsi</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='capsType'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='pciBackend'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </hostdev>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <rng supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio-transitional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio-non-transitional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>random</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>egd</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>builtin</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </rng>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <filesystem supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='driverType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>path</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>handle</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtiofs</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </filesystem>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <tpm supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tpm-tis</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tpm-crb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>emulator</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>external</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendVersion'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>2.0</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </tpm>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <redirdev supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='bus'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>usb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </redirdev>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <channel supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pty</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>unix</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </channel>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <crypto supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>qemu</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>builtin</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </crypto>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <interface supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>default</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>passt</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </interface>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <panic supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>isa</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>hyperv</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </panic>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <console supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>null</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vc</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pty</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>dev</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>file</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pipe</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>stdio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>udp</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tcp</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>unix</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>qemu-vdagent</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>dbus</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </console>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </devices>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <features>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <gic supported='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <vmcoreinfo supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <genid supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <backingStoreInput supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <backup supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <async-teardown supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <ps2 supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <sev supported='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <sgx supported='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <hyperv supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='features'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>relaxed</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vapic</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>spinlocks</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vpindex</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>runtime</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>synic</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>stimer</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>reset</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vendor_id</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>frequencies</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>reenlightenment</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tlbflush</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>ipi</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>avic</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>emsr_bitmap</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>xmm_input</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <defaults>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <spinlocks>4095</spinlocks>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <stimer_direct>on</stimer_direct>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <tlbflush_direct>on</tlbflush_direct>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <tlbflush_extended>on</tlbflush_extended>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </defaults>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </hyperv>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <launchSecurity supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='sectype'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tdx</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </launchSecurity>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </features>
Dec 02 23:42:20 compute-1 nova_compute[187157]: </domainCapabilities>
Dec 02 23:42:20 compute-1 nova_compute[187157]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.199 187161 DEBUG nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 02 23:42:20 compute-1 nova_compute[187157]: <domainCapabilities>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <domain>kvm</domain>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <arch>x86_64</arch>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <vcpu max='4096'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <iothreads supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <os supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <enum name='firmware'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>efi</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <loader supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>rom</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pflash</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='readonly'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>yes</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>no</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='secure'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>yes</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>no</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </loader>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </os>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <cpu>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <mode name='host-passthrough' supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='hostPassthroughMigratable'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>on</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>off</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </mode>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <mode name='maximum' supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='maximumMigratable'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>on</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>off</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </mode>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <mode name='host-model' supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <vendor>AMD</vendor>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='x2apic'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='hypervisor'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='stibp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='ssbd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='overflow-recov'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='succor'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='ibrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='lbrv'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='tsc-scale'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='flushbyasid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='pause-filter'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='pfthreshold'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <feature policy='disable' name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </mode>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <mode name='custom' supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-noTSX'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Broadwell-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cooperlake'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cooperlake-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Cooperlake-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Denverton'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Denverton-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Denverton-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Denverton-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Dhyana-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Genoa'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='auto-ibrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='auto-ibrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Milan'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Milan-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Milan-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Rome'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Rome-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Rome-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-Rome-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='EPYC-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='GraniteRapids'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='GraniteRapids-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='GraniteRapids-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx10'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx10-128'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx10-256'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx10-512'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-noTSX'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Haswell-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v5'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v6'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Icelake-Server-v7'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='IvyBridge'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='IvyBridge-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='IvyBridge-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='IvyBridge-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='KnightsMill'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512er'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512pf'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='KnightsMill-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512er'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512pf'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Opteron_G4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xop'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Opteron_G4-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xop'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Opteron_G5'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tbm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xop'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Opteron_G5-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tbm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xop'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SapphireRapids'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SapphireRapids-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SapphireRapids-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SapphireRapids-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='la57'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SierraForest'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cmpccxadd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='SierraForest-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-ifma'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cmpccxadd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Client-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='hle'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Skylake-Server-v5'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='pku'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge-v2'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge-v3'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='Snowridge-v4'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='erms'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='athlon'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='athlon-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='core2duo'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='core2duo-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='coreduo'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='coreduo-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='n270'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='n270-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='ss'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='phenom'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <blockers model='phenom-v1'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </blockers>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </mode>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </cpu>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <memoryBacking supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <enum name='sourceType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>file</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>anonymous</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <value>memfd</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </memoryBacking>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <devices>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <disk supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='diskDevice'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>disk</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>cdrom</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>floppy</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>lun</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='bus'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>fdc</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>scsi</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>usb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>sata</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio-transitional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio-non-transitional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </disk>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <graphics supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vnc</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>egl-headless</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>dbus</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </graphics>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <video supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='modelType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vga</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>cirrus</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>none</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>bochs</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>ramfb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </video>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <hostdev supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='mode'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>subsystem</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='startupPolicy'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>default</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>mandatory</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>requisite</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>optional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='subsysType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>usb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pci</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>scsi</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='capsType'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='pciBackend'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </hostdev>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <rng supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio-transitional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtio-non-transitional</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>random</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>egd</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>builtin</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </rng>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <filesystem supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='driverType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>path</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>handle</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>virtiofs</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </filesystem>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <tpm supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tpm-tis</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tpm-crb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>emulator</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>external</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendVersion'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>2.0</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </tpm>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <redirdev supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='bus'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>usb</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </redirdev>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <channel supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pty</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>unix</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </channel>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <crypto supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>qemu</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>builtin</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </crypto>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <interface supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='backendType'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>default</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>passt</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </interface>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <panic supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='model'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>isa</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>hyperv</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </panic>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <console supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='type'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>null</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vc</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pty</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>dev</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>file</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>pipe</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>stdio</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>udp</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tcp</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>unix</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>qemu-vdagent</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>dbus</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </console>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </devices>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <features>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <gic supported='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <vmcoreinfo supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <genid supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <backingStoreInput supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <backup supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <async-teardown supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <ps2 supported='yes'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <sev supported='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <sgx supported='no'/>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <hyperv supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='features'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>relaxed</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vapic</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>spinlocks</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vpindex</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>runtime</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>synic</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>stimer</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>reset</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>vendor_id</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>frequencies</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>reenlightenment</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tlbflush</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>ipi</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>avic</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>emsr_bitmap</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>xmm_input</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <defaults>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <spinlocks>4095</spinlocks>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <stimer_direct>on</stimer_direct>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <tlbflush_direct>on</tlbflush_direct>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <tlbflush_extended>on</tlbflush_extended>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </defaults>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </hyperv>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     <launchSecurity supported='yes'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       <enum name='sectype'>
Dec 02 23:42:20 compute-1 nova_compute[187157]:         <value>tdx</value>
Dec 02 23:42:20 compute-1 nova_compute[187157]:       </enum>
Dec 02 23:42:20 compute-1 nova_compute[187157]:     </launchSecurity>
Dec 02 23:42:20 compute-1 nova_compute[187157]:   </features>
Dec 02 23:42:20 compute-1 nova_compute[187157]: </domainCapabilities>
Dec 02 23:42:20 compute-1 nova_compute[187157]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.260 187161 DEBUG nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.260 187161 DEBUG nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.261 187161 DEBUG nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.261 187161 INFO nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Secure Boot support detected
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.269 187161 INFO nova.virt.libvirt.driver [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.270 187161 INFO nova.virt.libvirt.driver [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 02 23:42:20 compute-1 systemd[1]: Started libvirt nodedev daemon.
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.458 187161 DEBUG nova.virt.libvirt.driver [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] cpu compare xml: <cpu match="exact">
Dec 02 23:42:20 compute-1 nova_compute[187157]:   <model>Nehalem</model>
Dec 02 23:42:20 compute-1 nova_compute[187157]: </cpu>
Dec 02 23:42:20 compute-1 nova_compute[187157]:  _compare_cpu /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10922
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.460 187161 DEBUG nova.virt.libvirt.driver [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1177
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.552 187161 WARNING nova.virt.libvirt.driver [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.552 187161 DEBUG nova.virt.libvirt.volume.mount [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 02 23:42:20 compute-1 nova_compute[187157]: 2025-12-02 23:42:20.971 187161 INFO nova.virt.node [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Determined node identity a6c5ccbf-f26a-4e87-95da-56336ae0b343 from /var/lib/nova/compute_id
Dec 02 23:42:21 compute-1 nova_compute[187157]: 2025-12-02 23:42:21.487 187161 WARNING nova.compute.manager [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Compute nodes ['a6c5ccbf-f26a-4e87-95da-56336ae0b343'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Dec 02 23:42:22 compute-1 podman[187477]: 2025-12-02 23:42:22.26752889 +0000 UTC m=+0.095477013 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:42:22 compute-1 nova_compute[187157]: 2025-12-02 23:42:22.518 187161 INFO nova.compute.manager [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 02 23:42:23 compute-1 sshd-session[187497]: Accepted publickey for zuul from 192.168.122.30 port 42014 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:42:23 compute-1 systemd-logind[790]: New session 26 of user zuul.
Dec 02 23:42:23 compute-1 systemd[1]: Started Session 26 of User zuul.
Dec 02 23:42:23 compute-1 sshd-session[187497]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:42:23 compute-1 nova_compute[187157]: 2025-12-02 23:42:23.537 187161 WARNING nova.compute.manager [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Dec 02 23:42:23 compute-1 nova_compute[187157]: 2025-12-02 23:42:23.539 187161 DEBUG oslo_concurrency.lockutils [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:42:23 compute-1 nova_compute[187157]: 2025-12-02 23:42:23.539 187161 DEBUG oslo_concurrency.lockutils [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:42:23 compute-1 nova_compute[187157]: 2025-12-02 23:42:23.539 187161 DEBUG oslo_concurrency.lockutils [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:42:23 compute-1 nova_compute[187157]: 2025-12-02 23:42:23.540 187161 DEBUG nova.compute.resource_tracker [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:42:23 compute-1 nova_compute[187157]: 2025-12-02 23:42:23.785 187161 WARNING nova.virt.libvirt.driver [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:42:23 compute-1 nova_compute[187157]: 2025-12-02 23:42:23.787 187161 DEBUG oslo_concurrency.processutils [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:42:23 compute-1 nova_compute[187157]: 2025-12-02 23:42:23.824 187161 DEBUG oslo_concurrency.processutils [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:42:23 compute-1 nova_compute[187157]: 2025-12-02 23:42:23.825 187161 DEBUG nova.compute.resource_tracker [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6180MB free_disk=73.3690299987793GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:42:23 compute-1 nova_compute[187157]: 2025-12-02 23:42:23.825 187161 DEBUG oslo_concurrency.lockutils [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:42:23 compute-1 nova_compute[187157]: 2025-12-02 23:42:23.826 187161 DEBUG oslo_concurrency.lockutils [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:42:24 compute-1 nova_compute[187157]: 2025-12-02 23:42:24.333 187161 WARNING nova.compute.resource_tracker [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] No compute node record for compute-1.ctlplane.example.com:a6c5ccbf-f26a-4e87-95da-56336ae0b343: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host a6c5ccbf-f26a-4e87-95da-56336ae0b343 could not be found.
Dec 02 23:42:24 compute-1 python3.9[187651]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:42:24 compute-1 nova_compute[187157]: 2025-12-02 23:42:24.846 187161 INFO nova.compute.resource_tracker [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: a6c5ccbf-f26a-4e87-95da-56336ae0b343
Dec 02 23:42:26 compute-1 sudo[187805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yekcmcvjdjtgjcgkplcflogrbybrfvmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718945.414131-53-177771666924505/AnsiballZ_systemd_service.py'
Dec 02 23:42:26 compute-1 sudo[187805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:26 compute-1 nova_compute[187157]: 2025-12-02 23:42:26.376 187161 DEBUG nova.compute.resource_tracker [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:42:26 compute-1 nova_compute[187157]: 2025-12-02 23:42:26.377 187161 DEBUG nova.compute.resource_tracker [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:42:23 up 49 min,  0 user,  load average: 1.05, 0.83, 0.63\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:42:26 compute-1 python3.9[187807]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:42:26 compute-1 systemd[1]: Reloading.
Dec 02 23:42:26 compute-1 systemd-rc-local-generator[187832]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:42:26 compute-1 systemd-sysv-generator[187838]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:42:26 compute-1 sudo[187805]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:26 compute-1 nova_compute[187157]: 2025-12-02 23:42:26.942 187161 INFO nova.scheduler.client.report [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] [req-769724ac-49d7-4d6c-a525-1cc69ac44b58] Created resource provider record via placement API for resource provider with UUID a6c5ccbf-f26a-4e87-95da-56336ae0b343 and name compute-1.ctlplane.example.com.
Dec 02 23:42:26 compute-1 nova_compute[187157]: 2025-12-02 23:42:26.995 187161 DEBUG nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 02 23:42:26 compute-1 nova_compute[187157]: ] _kernel_supports_amd_sev /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1953
Dec 02 23:42:26 compute-1 nova_compute[187157]: 2025-12-02 23:42:26.996 187161 INFO nova.virt.libvirt.host [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] kernel doesn't support AMD SEV
Dec 02 23:42:26 compute-1 nova_compute[187157]: 2025-12-02 23:42:26.996 187161 DEBUG nova.compute.provider_tree [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Updating inventory in ProviderTree for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 02 23:42:26 compute-1 nova_compute[187157]: 2025-12-02 23:42:26.996 187161 DEBUG nova.virt.libvirt.driver [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 02 23:42:26 compute-1 nova_compute[187157]: 2025-12-02 23:42:26.999 187161 DEBUG nova.virt.libvirt.driver [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Libvirt baseline CPU <cpu>
Dec 02 23:42:26 compute-1 nova_compute[187157]:   <arch>x86_64</arch>
Dec 02 23:42:26 compute-1 nova_compute[187157]:   <model>Nehalem</model>
Dec 02 23:42:26 compute-1 nova_compute[187157]:   <vendor>AMD</vendor>
Dec 02 23:42:26 compute-1 nova_compute[187157]:   <topology sockets="8" cores="1" threads="1"/>
Dec 02 23:42:26 compute-1 nova_compute[187157]:   <maxphysaddr mode="emulate" bits="40"/>
Dec 02 23:42:26 compute-1 nova_compute[187157]: </cpu>
Dec 02 23:42:26 compute-1 nova_compute[187157]:  _get_guest_baseline_cpu_features /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13545
Dec 02 23:42:27 compute-1 nova_compute[187157]: 2025-12-02 23:42:27.571 187161 DEBUG nova.scheduler.client.report [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Updated inventory for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Dec 02 23:42:27 compute-1 nova_compute[187157]: 2025-12-02 23:42:27.571 187161 DEBUG nova.compute.provider_tree [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Updating resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 02 23:42:27 compute-1 nova_compute[187157]: 2025-12-02 23:42:27.572 187161 DEBUG nova.compute.provider_tree [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Updating inventory in ProviderTree for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 02 23:42:27 compute-1 python3.9[187992]: ansible-ansible.builtin.service_facts Invoked
Dec 02 23:42:27 compute-1 network[188009]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 23:42:27 compute-1 network[188010]: 'network-scripts' will be removed from distribution in near future.
Dec 02 23:42:27 compute-1 network[188011]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 23:42:27 compute-1 nova_compute[187157]: 2025-12-02 23:42:27.784 187161 DEBUG nova.compute.provider_tree [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Updating resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 02 23:42:28 compute-1 nova_compute[187157]: 2025-12-02 23:42:28.298 187161 DEBUG nova.compute.resource_tracker [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:42:28 compute-1 nova_compute[187157]: 2025-12-02 23:42:28.299 187161 DEBUG oslo_concurrency.lockutils [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.473s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:42:28 compute-1 nova_compute[187157]: 2025-12-02 23:42:28.299 187161 DEBUG nova.service [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.12/site-packages/nova/service.py:177
Dec 02 23:42:28 compute-1 nova_compute[187157]: 2025-12-02 23:42:28.472 187161 DEBUG nova.service [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.12/site-packages/nova/service.py:194
Dec 02 23:42:28 compute-1 nova_compute[187157]: 2025-12-02 23:42:28.473 187161 DEBUG nova.servicegroup.drivers.db [None req-d30eeb30-fcce-4443-a9ec-b875beec3f31 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.12/site-packages/nova/servicegroup/drivers/db.py:44
Dec 02 23:42:32 compute-1 sudo[188283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuwammcbizfwmqsfzuovlnkradhrsijg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718952.5345109-91-10242339521320/AnsiballZ_systemd_service.py'
Dec 02 23:42:32 compute-1 sudo[188283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:33 compute-1 python3.9[188285]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:42:33 compute-1 sudo[188283]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:34 compute-1 sudo[188436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlgheofzezuqtnqedkdhmxjnywtpgqdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718953.7240582-111-97277236113357/AnsiballZ_file.py'
Dec 02 23:42:34 compute-1 sudo[188436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:34 compute-1 python3.9[188438]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:34 compute-1 sudo[188436]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:34 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 23:42:34 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 23:42:35 compute-1 sudo[188601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlecaoioxfbdkwoxmafmyuutzknqyhig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718954.6959121-127-281174749244759/AnsiballZ_file.py'
Dec 02 23:42:35 compute-1 sudo[188601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:35 compute-1 podman[188563]: 2025-12-02 23:42:35.2037462 +0000 UTC m=+0.147312361 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller)
Dec 02 23:42:35 compute-1 python3.9[188609]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:35 compute-1 sudo[188601]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:36 compute-1 sudo[188766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgnaefzypplshaufvkamnizogbczcvul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718955.7485795-145-150852585175377/AnsiballZ_command.py'
Dec 02 23:42:36 compute-1 sudo[188766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:36 compute-1 python3.9[188768]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:42:36 compute-1 sudo[188766]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:37 compute-1 python3.9[188920]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 23:42:38 compute-1 podman[189025]: 2025-12-02 23:42:38.288571344 +0000 UTC m=+0.115760785 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 23:42:38 compute-1 sudo[189088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcxgjbskggxusenwwotqwkicsrxjftzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718957.8505836-181-281071985445290/AnsiballZ_systemd_service.py'
Dec 02 23:42:38 compute-1 sudo[189088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:38 compute-1 python3.9[189092]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:42:38 compute-1 systemd[1]: Reloading.
Dec 02 23:42:38 compute-1 systemd-rc-local-generator[189114]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:42:38 compute-1 systemd-sysv-generator[189118]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:42:38 compute-1 sudo[189088]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:39 compute-1 sudo[189277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvtjuyevsupcbjqpylvhbsfyixcvecuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718959.2162645-197-115142049650000/AnsiballZ_command.py'
Dec 02 23:42:39 compute-1 sudo[189277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:39 compute-1 python3.9[189279]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:42:39 compute-1 sudo[189277]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:40 compute-1 sudo[189430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxktwerunrolcxuruksyboxlhpmjejhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718960.2806244-215-235620513500197/AnsiballZ_file.py'
Dec 02 23:42:40 compute-1 sudo[189430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:40 compute-1 python3.9[189432]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:42:40 compute-1 sudo[189430]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:41 compute-1 python3.9[189582]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:42:42 compute-1 python3.9[189734]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:42:43 compute-1 nova_compute[187157]: 2025-12-02 23:42:43.475 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:42:43 compute-1 python3.9[189855]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718962.153269-247-23529848741692/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:42:44 compute-1 nova_compute[187157]: 2025-12-02 23:42:44.027 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:42:44 compute-1 sudo[190005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szvgqgtgabjaiqegymhccapjmlildmef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718963.803814-277-259391481862032/AnsiballZ_group.py'
Dec 02 23:42:44 compute-1 sudo[190005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:44 compute-1 python3.9[190007]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Dec 02 23:42:44 compute-1 sudo[190005]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:45 compute-1 sudo[190157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqjfqmxdopvkhesypcppymhokwkkcncp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718964.9482849-299-189967133901596/AnsiballZ_getent.py'
Dec 02 23:42:45 compute-1 sudo[190157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:45 compute-1 python3.9[190159]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Dec 02 23:42:45 compute-1 sudo[190157]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:46 compute-1 sudo[190310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvnvnsaiucjnwzyfcutotxavrxsdagqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718966.0775573-315-129726133885350/AnsiballZ_group.py'
Dec 02 23:42:46 compute-1 sudo[190310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:46 compute-1 python3.9[190312]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 02 23:42:46 compute-1 groupadd[190313]: group added to /etc/group: name=ceilometer, GID=42405
Dec 02 23:42:46 compute-1 groupadd[190313]: group added to /etc/gshadow: name=ceilometer
Dec 02 23:42:46 compute-1 groupadd[190313]: new group: name=ceilometer, GID=42405
Dec 02 23:42:46 compute-1 sudo[190310]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:47 compute-1 sudo[190468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akrfhwcxbcrqghdcazwlieuwalfdmydp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718967.0687118-331-67997154808528/AnsiballZ_user.py'
Dec 02 23:42:47 compute-1 sudo[190468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:47 compute-1 python3.9[190470]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 02 23:42:47 compute-1 useradd[190472]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Dec 02 23:42:47 compute-1 useradd[190472]: add 'ceilometer' to group 'libvirt'
Dec 02 23:42:47 compute-1 useradd[190472]: add 'ceilometer' to shadow group 'libvirt'
Dec 02 23:42:48 compute-1 sudo[190468]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:49 compute-1 python3.9[190628]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:42:50 compute-1 python3.9[190749]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764718969.1690521-383-117037764780441/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:51 compute-1 python3.9[190899]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:42:51 compute-1 python3.9[191020]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764718970.5349243-383-128822630976038/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:52 compute-1 python3.9[191170]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:42:53 compute-1 podman[191265]: 2025-12-02 23:42:53.065905031 +0000 UTC m=+0.123779280 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 02 23:42:53 compute-1 python3.9[191302]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764718971.9455285-383-178278984300112/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:53 compute-1 python3.9[191462]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:42:54 compute-1 python3.9[191614]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:42:55 compute-1 python3.9[191766]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:42:56 compute-1 python3.9[191887]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718975.0221674-501-120042928346835/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:57 compute-1 python3.9[192037]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:42:57 compute-1 python3.9[192113]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:58 compute-1 python3.9[192263]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:42:59 compute-1 python3.9[192384]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718977.9318485-501-123018546864741/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=3a381808a650224f9d664cc68513cbbb45330072 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:59 compute-1 python3.9[192534]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:00 compute-1 python3.9[192655]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718979.4051752-501-137226193829574/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:01 compute-1 python3.9[192805]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:43:01.672 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:43:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:43:01.673 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:43:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:43:01.673 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:43:01 compute-1 anacron[7493]: Job `cron.weekly' started
Dec 02 23:43:01 compute-1 anacron[7493]: Job `cron.weekly' terminated
Dec 02 23:43:02 compute-1 python3.9[192929]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718980.815634-501-195309739214322/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:02 compute-1 python3.9[193079]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:03 compute-1 python3.9[193200]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718982.2456746-501-123089408867258/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:04 compute-1 python3.9[193350]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:04 compute-1 python3.9[193471]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718983.5877664-501-275215166454671/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:05 compute-1 podman[193595]: 2025-12-02 23:43:05.683918418 +0000 UTC m=+0.177214939 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 02 23:43:05 compute-1 python3.9[193632]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:06 compute-1 python3.9[193768]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718985.14621-501-133977672466821/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:07 compute-1 python3.9[193918]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:07 compute-1 python3.9[194039]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718986.632621-501-226097970415466/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:08 compute-1 podman[194163]: 2025-12-02 23:43:08.609592503 +0000 UTC m=+0.083574002 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:43:08 compute-1 python3.9[194202]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:09 compute-1 python3.9[194330]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718988.1585486-501-263410331140853/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:10 compute-1 python3.9[194480]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:10 compute-1 python3.9[194601]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718989.700129-501-179338308086280/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:11 compute-1 python3.9[194751]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:12 compute-1 python3.9[194827]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:13 compute-1 python3.9[194977]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:13 compute-1 python3.9[195053]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:14 compute-1 python3.9[195203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:15 compute-1 python3.9[195279]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:15 compute-1 sudo[195429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcpptfnejnimfwpnoioqmnjildsuqxyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718995.3932757-879-35058299521692/AnsiballZ_file.py'
Dec 02 23:43:15 compute-1 sudo[195429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:15 compute-1 python3.9[195431]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:16 compute-1 sudo[195429]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:16 compute-1 sudo[195581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfuatktxnuckqqsotftlvhdzoveiciwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718996.264339-895-20372675910822/AnsiballZ_file.py'
Dec 02 23:43:16 compute-1 sudo[195581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:16 compute-1 python3.9[195583]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:16 compute-1 sudo[195581]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:17 compute-1 sudo[195733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfhaagfagewanfbdbwpgsqjeynnvvxqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718997.1379418-911-82070207203961/AnsiballZ_file.py'
Dec 02 23:43:17 compute-1 sudo[195733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:17 compute-1 nova_compute[187157]: 2025-12-02 23:43:17.702 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:17 compute-1 nova_compute[187157]: 2025-12-02 23:43:17.702 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:17 compute-1 nova_compute[187157]: 2025-12-02 23:43:17.703 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:17 compute-1 nova_compute[187157]: 2025-12-02 23:43:17.703 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:17 compute-1 nova_compute[187157]: 2025-12-02 23:43:17.703 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:17 compute-1 nova_compute[187157]: 2025-12-02 23:43:17.703 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:17 compute-1 nova_compute[187157]: 2025-12-02 23:43:17.704 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:17 compute-1 nova_compute[187157]: 2025-12-02 23:43:17.704 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:43:17 compute-1 nova_compute[187157]: 2025-12-02 23:43:17.704 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:17 compute-1 python3.9[195735]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:43:17 compute-1 sudo[195733]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:18 compute-1 nova_compute[187157]: 2025-12-02 23:43:18.228 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:43:18 compute-1 nova_compute[187157]: 2025-12-02 23:43:18.229 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:43:18 compute-1 nova_compute[187157]: 2025-12-02 23:43:18.229 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:43:18 compute-1 nova_compute[187157]: 2025-12-02 23:43:18.229 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:43:18 compute-1 nova_compute[187157]: 2025-12-02 23:43:18.481 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:43:18 compute-1 nova_compute[187157]: 2025-12-02 23:43:18.482 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:43:18 compute-1 sudo[195885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkwchblkqfvagvuhohltcdknjriqkzbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718998.0344179-927-45406426607726/AnsiballZ_systemd_service.py'
Dec 02 23:43:18 compute-1 sudo[195885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:18 compute-1 nova_compute[187157]: 2025-12-02 23:43:18.524 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:43:18 compute-1 nova_compute[187157]: 2025-12-02 23:43:18.524 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6170MB free_disk=73.3686752319336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:43:18 compute-1 nova_compute[187157]: 2025-12-02 23:43:18.525 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:43:18 compute-1 nova_compute[187157]: 2025-12-02 23:43:18.525 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:43:18 compute-1 python3.9[195888]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:43:18 compute-1 systemd[1]: Reloading.
Dec 02 23:43:18 compute-1 systemd-rc-local-generator[195914]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:43:18 compute-1 systemd-sysv-generator[195920]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:43:19 compute-1 systemd[1]: Listening on Podman API Socket.
Dec 02 23:43:19 compute-1 sudo[195885]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:19 compute-1 nova_compute[187157]: 2025-12-02 23:43:19.579 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:43:19 compute-1 nova_compute[187157]: 2025-12-02 23:43:19.581 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:43:18 up 50 min,  0 user,  load average: 0.77, 0.78, 0.63\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:43:19 compute-1 nova_compute[187157]: 2025-12-02 23:43:19.606 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:43:19 compute-1 sudo[196076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdvyeuqxahcqpschjbzqffmpnwvaljoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718999.6159632-945-25054284645169/AnsiballZ_stat.py'
Dec 02 23:43:19 compute-1 sudo[196076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:20 compute-1 python3.9[196078]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:20 compute-1 sudo[196076]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:20 compute-1 nova_compute[187157]: 2025-12-02 23:43:20.113 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:43:20 compute-1 sudo[196199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egyifqfptfkgbprwtxjljxoluooywyex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718999.6159632-945-25054284645169/AnsiballZ_copy.py'
Dec 02 23:43:20 compute-1 sudo[196199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:20 compute-1 nova_compute[187157]: 2025-12-02 23:43:20.621 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:43:20 compute-1 nova_compute[187157]: 2025-12-02 23:43:20.622 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.097s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:43:20 compute-1 python3.9[196201]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718999.6159632-945-25054284645169/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:43:20 compute-1 sudo[196199]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:21 compute-1 sudo[196351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouzqoxgvabgdsijjrgjgtlsosdsenxgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719001.181368-979-78277269388316/AnsiballZ_container_config_data.py'
Dec 02 23:43:21 compute-1 sudo[196351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:21 compute-1 python3.9[196353]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Dec 02 23:43:21 compute-1 sudo[196351]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:22 compute-1 sudo[196503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jncwbvcqsruwrrhftxizkmcfvkzptqui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719002.2407932-997-260039553257068/AnsiballZ_container_config_hash.py'
Dec 02 23:43:22 compute-1 sudo[196503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:23 compute-1 python3.9[196505]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 23:43:23 compute-1 sudo[196503]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:23 compute-1 podman[196530]: 2025-12-02 23:43:23.273860131 +0000 UTC m=+0.111831883 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:43:23 compute-1 sudo[196675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enetcipsexwxjhtolvfwmntkrhfrwbzz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764719003.4375646-1017-237204030065660/AnsiballZ_edpm_container_manage.py'
Dec 02 23:43:23 compute-1 sudo[196675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:24 compute-1 python3[196677]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 23:43:25 compute-1 podman[196692]: 2025-12-02 23:43:25.662765346 +0000 UTC m=+1.335093052 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec 02 23:43:25 compute-1 podman[196790]: 2025-12-02 23:43:25.870526444 +0000 UTC m=+0.076261578 container create ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 23:43:25 compute-1 podman[196790]: 2025-12-02 23:43:25.833671727 +0000 UTC m=+0.039406911 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec 02 23:43:25 compute-1 python3[196677]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Dec 02 23:43:26 compute-1 sudo[196675]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:26 compute-1 sudo[196978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awygfiervjphbstdlluecplyaxqzbxms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719006.324022-1033-227554654819095/AnsiballZ_stat.py'
Dec 02 23:43:26 compute-1 sudo[196978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:26 compute-1 python3.9[196980]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:43:27 compute-1 sudo[196978]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:27 compute-1 sudo[197132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgljrdxslldsfpmhzgkpdefgpzpmkauh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719007.3634028-1051-141591549185628/AnsiballZ_file.py'
Dec 02 23:43:27 compute-1 sudo[197132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:28 compute-1 python3.9[197134]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:28 compute-1 sudo[197132]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:28 compute-1 sudo[197283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djutdtklycdlstnlokmhujrxqjxanefv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719008.1057894-1051-230114766858146/AnsiballZ_copy.py'
Dec 02 23:43:28 compute-1 sudo[197283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:28 compute-1 python3.9[197285]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764719008.1057894-1051-230114766858146/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:28 compute-1 sudo[197283]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:29 compute-1 sudo[197359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmuaqskimadgtfyibtpfctztdvivjrkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719008.1057894-1051-230114766858146/AnsiballZ_systemd.py'
Dec 02 23:43:29 compute-1 sudo[197359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:29 compute-1 python3.9[197361]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:43:29 compute-1 systemd[1]: Reloading.
Dec 02 23:43:29 compute-1 systemd-rc-local-generator[197389]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:43:29 compute-1 systemd-sysv-generator[197392]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:43:29 compute-1 sudo[197359]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:30 compute-1 sudo[197470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrtogvvfjjaaangffqwrwbkeabjjjjts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719008.1057894-1051-230114766858146/AnsiballZ_systemd.py'
Dec 02 23:43:30 compute-1 sudo[197470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:30 compute-1 python3.9[197472]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:43:30 compute-1 systemd[1]: Reloading.
Dec 02 23:43:30 compute-1 systemd-rc-local-generator[197502]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:43:30 compute-1 systemd-sysv-generator[197506]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:43:30 compute-1 systemd[1]: Starting podman_exporter container...
Dec 02 23:43:31 compute-1 systemd[1]: Started libcrun container.
Dec 02 23:43:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be4e3eef638878dc584b854bf0466db9438ef9c1317fd34df57d1a7d64068992/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be4e3eef638878dc584b854bf0466db9438ef9c1317fd34df57d1a7d64068992/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:31 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28.
Dec 02 23:43:31 compute-1 podman[197511]: 2025-12-02 23:43:31.125105334 +0000 UTC m=+0.164848921 container init ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 23:43:31 compute-1 podman_exporter[197526]: ts=2025-12-02T23:43:31.150Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 02 23:43:31 compute-1 podman_exporter[197526]: ts=2025-12-02T23:43:31.151Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 02 23:43:31 compute-1 podman_exporter[197526]: ts=2025-12-02T23:43:31.151Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 02 23:43:31 compute-1 podman_exporter[197526]: ts=2025-12-02T23:43:31.151Z caller=handler.go:105 level=info collector=container
Dec 02 23:43:31 compute-1 podman[197511]: 2025-12-02 23:43:31.157388867 +0000 UTC m=+0.197132424 container start ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:43:31 compute-1 podman[197511]: podman_exporter
Dec 02 23:43:31 compute-1 systemd[1]: Starting Podman API Service...
Dec 02 23:43:31 compute-1 systemd[1]: Started Podman API Service.
Dec 02 23:43:31 compute-1 systemd[1]: Started podman_exporter container.
Dec 02 23:43:31 compute-1 podman[197537]: time="2025-12-02T23:43:31Z" level=info msg="/usr/bin/podman filtering at log level info"
Dec 02 23:43:31 compute-1 podman[197537]: time="2025-12-02T23:43:31Z" level=info msg="Setting parallel job count to 25"
Dec 02 23:43:31 compute-1 podman[197537]: time="2025-12-02T23:43:31Z" level=info msg="Using sqlite as database backend"
Dec 02 23:43:31 compute-1 sudo[197470]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:31 compute-1 podman[197537]: time="2025-12-02T23:43:31Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Dec 02 23:43:31 compute-1 podman[197537]: time="2025-12-02T23:43:31Z" level=info msg="Using systemd socket activation to determine API endpoint"
Dec 02 23:43:31 compute-1 podman[197537]: time="2025-12-02T23:43:31Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Dec 02 23:43:31 compute-1 podman[197537]: @ - - [02/Dec/2025:23:43:31 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 02 23:43:31 compute-1 podman[197537]: time="2025-12-02T23:43:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:43:31 compute-1 podman[197535]: 2025-12-02 23:43:31.264565583 +0000 UTC m=+0.087713873 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:43:31 compute-1 systemd[1]: ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28-741e0ad89012e67e.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 23:43:31 compute-1 systemd[1]: ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28-741e0ad89012e67e.service: Failed with result 'exit-code'.
Dec 02 23:43:31 compute-1 podman[197537]: @ - - [02/Dec/2025:23:43:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14040 "" "Go-http-client/1.1"
Dec 02 23:43:31 compute-1 podman_exporter[197526]: ts=2025-12-02T23:43:31.288Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 02 23:43:31 compute-1 podman_exporter[197526]: ts=2025-12-02T23:43:31.289Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 02 23:43:31 compute-1 podman_exporter[197526]: ts=2025-12-02T23:43:31.290Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 02 23:43:31 compute-1 sudo[197721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeuziczginhkibdyaajrkgnkvwnufqam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719011.5025542-1099-204524592522123/AnsiballZ_systemd.py'
Dec 02 23:43:31 compute-1 sudo[197721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:32 compute-1 python3.9[197723]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:43:32 compute-1 systemd[1]: Stopping podman_exporter container...
Dec 02 23:43:32 compute-1 podman[197537]: @ - - [02/Dec/2025:23:43:31 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Dec 02 23:43:32 compute-1 systemd[1]: libpod-ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28.scope: Deactivated successfully.
Dec 02 23:43:32 compute-1 podman[197727]: 2025-12-02 23:43:32.533658662 +0000 UTC m=+0.066002693 container died ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 23:43:32 compute-1 systemd[1]: ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28-741e0ad89012e67e.timer: Deactivated successfully.
Dec 02 23:43:32 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28.
Dec 02 23:43:32 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28-userdata-shm.mount: Deactivated successfully.
Dec 02 23:43:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-be4e3eef638878dc584b854bf0466db9438ef9c1317fd34df57d1a7d64068992-merged.mount: Deactivated successfully.
Dec 02 23:43:32 compute-1 podman[197727]: 2025-12-02 23:43:32.856642527 +0000 UTC m=+0.388986558 container cleanup ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 23:43:32 compute-1 podman[197727]: podman_exporter
Dec 02 23:43:32 compute-1 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 02 23:43:32 compute-1 podman[197756]: podman_exporter
Dec 02 23:43:32 compute-1 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Dec 02 23:43:32 compute-1 systemd[1]: Stopped podman_exporter container.
Dec 02 23:43:32 compute-1 systemd[1]: Starting podman_exporter container...
Dec 02 23:43:33 compute-1 systemd[1]: Started libcrun container.
Dec 02 23:43:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be4e3eef638878dc584b854bf0466db9438ef9c1317fd34df57d1a7d64068992/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be4e3eef638878dc584b854bf0466db9438ef9c1317fd34df57d1a7d64068992/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:33 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28.
Dec 02 23:43:33 compute-1 podman[197769]: 2025-12-02 23:43:33.162328871 +0000 UTC m=+0.173622160 container init ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:43:33 compute-1 podman_exporter[197785]: ts=2025-12-02T23:43:33.186Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 02 23:43:33 compute-1 podman_exporter[197785]: ts=2025-12-02T23:43:33.186Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 02 23:43:33 compute-1 podman_exporter[197785]: ts=2025-12-02T23:43:33.186Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 02 23:43:33 compute-1 podman_exporter[197785]: ts=2025-12-02T23:43:33.186Z caller=handler.go:105 level=info collector=container
Dec 02 23:43:33 compute-1 podman[197537]: @ - - [02/Dec/2025:23:43:33 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 02 23:43:33 compute-1 podman[197537]: time="2025-12-02T23:43:33Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:43:33 compute-1 podman[197769]: 2025-12-02 23:43:33.207422343 +0000 UTC m=+0.218715602 container start ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 23:43:33 compute-1 podman[197769]: podman_exporter
Dec 02 23:43:33 compute-1 podman[197537]: @ - - [02/Dec/2025:23:43:33 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14042 "" "Go-http-client/1.1"
Dec 02 23:43:33 compute-1 podman_exporter[197785]: ts=2025-12-02T23:43:33.215Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 02 23:43:33 compute-1 podman_exporter[197785]: ts=2025-12-02T23:43:33.216Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 02 23:43:33 compute-1 podman_exporter[197785]: ts=2025-12-02T23:43:33.216Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 02 23:43:33 compute-1 systemd[1]: Started podman_exporter container.
Dec 02 23:43:33 compute-1 sudo[197721]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:33 compute-1 podman[197794]: 2025-12-02 23:43:33.296690793 +0000 UTC m=+0.082562115 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:43:33 compute-1 sudo[197968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnuaktthveievfhgjxwidjeetrbftmyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719013.5325634-1115-12204079379831/AnsiballZ_stat.py'
Dec 02 23:43:33 compute-1 sudo[197968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:34 compute-1 python3.9[197970]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:34 compute-1 sudo[197968]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:34 compute-1 sudo[198091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxuwmiahockeubuvraspvsruopuuskse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719013.5325634-1115-12204079379831/AnsiballZ_copy.py'
Dec 02 23:43:34 compute-1 sudo[198091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:34 compute-1 python3.9[198093]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764719013.5325634-1115-12204079379831/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:43:34 compute-1 sudo[198091]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:35 compute-1 sudo[198243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfvkvzrofqxuqnxxyeltcgembqlyabmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719015.2387767-1149-127842325424453/AnsiballZ_container_config_data.py'
Dec 02 23:43:35 compute-1 sudo[198243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:35 compute-1 python3.9[198245]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Dec 02 23:43:35 compute-1 sudo[198243]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:36 compute-1 podman[198270]: 2025-12-02 23:43:36.329874285 +0000 UTC m=+0.160768389 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Dec 02 23:43:36 compute-1 sudo[198421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsonildqngnuwumoyzbosmqjkwistwjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719016.2107449-1167-59229228011300/AnsiballZ_container_config_hash.py'
Dec 02 23:43:36 compute-1 sudo[198421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:36 compute-1 python3.9[198423]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 23:43:36 compute-1 sudo[198421]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:37 compute-1 sudo[198573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quitmvxehyjktqrpsmsadilbgnltjqzo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764719017.3147962-1187-7823532731534/AnsiballZ_edpm_container_manage.py'
Dec 02 23:43:37 compute-1 sudo[198573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:37 compute-1 python3[198575]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 23:43:40 compute-1 podman[198616]: 2025-12-02 23:43:40.078932134 +0000 UTC m=+0.907882634 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:43:40 compute-1 podman[198588]: 2025-12-02 23:43:40.439038202 +0000 UTC m=+2.426124792 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 02 23:43:40 compute-1 podman[198703]: 2025-12-02 23:43:40.632616728 +0000 UTC m=+0.059479191 container create c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, version=9.6, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container)
Dec 02 23:43:40 compute-1 podman[198703]: 2025-12-02 23:43:40.600658733 +0000 UTC m=+0.027521246 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 02 23:43:40 compute-1 python3[198575]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 02 23:43:40 compute-1 sudo[198573]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:41 compute-1 sudo[198892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmrycmfastzivpalozwylbuaarytcwrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719021.0838542-1203-114849955373768/AnsiballZ_stat.py'
Dec 02 23:43:41 compute-1 sudo[198892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:41 compute-1 python3.9[198894]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:43:41 compute-1 sudo[198892]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:42 compute-1 sudo[199046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cztetkkgshhsemwffspvdzrhnmcirefi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719022.0753589-1221-52052446993128/AnsiballZ_file.py'
Dec 02 23:43:42 compute-1 sudo[199046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:42 compute-1 python3.9[199048]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:42 compute-1 sudo[199046]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:43 compute-1 sudo[199197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdzbnhvsvrywlzgfjvjaoztxwtrdtjoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719022.8150132-1221-88103476544278/AnsiballZ_copy.py'
Dec 02 23:43:43 compute-1 sudo[199197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:43 compute-1 python3.9[199199]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764719022.8150132-1221-88103476544278/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:43 compute-1 sudo[199197]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:44 compute-1 sudo[199273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfvmiptvgxdlupvgvfndkckyfaiosygt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719022.8150132-1221-88103476544278/AnsiballZ_systemd.py'
Dec 02 23:43:44 compute-1 sudo[199273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:44 compute-1 python3.9[199275]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:43:44 compute-1 systemd[1]: Reloading.
Dec 02 23:43:44 compute-1 systemd-sysv-generator[199306]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:43:44 compute-1 systemd-rc-local-generator[199303]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:43:44 compute-1 sudo[199273]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:45 compute-1 sudo[199384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gitzoaswkaknkvclljpkjntksbcodwvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719022.8150132-1221-88103476544278/AnsiballZ_systemd.py'
Dec 02 23:43:45 compute-1 sudo[199384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:45 compute-1 python3.9[199386]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:43:45 compute-1 systemd[1]: Reloading.
Dec 02 23:43:45 compute-1 systemd-rc-local-generator[199408]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:43:45 compute-1 systemd-sysv-generator[199413]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:43:45 compute-1 systemd[1]: Starting openstack_network_exporter container...
Dec 02 23:43:46 compute-1 systemd[1]: Started libcrun container.
Dec 02 23:43:46 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89a13aee767374e4e497e5f23913f8fedeb57718ae49d9dfeaff423cba6a3f2f/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:46 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89a13aee767374e4e497e5f23913f8fedeb57718ae49d9dfeaff423cba6a3f2f/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:46 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89a13aee767374e4e497e5f23913f8fedeb57718ae49d9dfeaff423cba6a3f2f/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:46 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f.
Dec 02 23:43:46 compute-1 podman[199427]: 2025-12-02 23:43:46.165004498 +0000 UTC m=+0.193838903 container init c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_id=edpm, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 23:43:46 compute-1 openstack_network_exporter[199442]: INFO    23:43:46 main.go:48: registering *bridge.Collector
Dec 02 23:43:46 compute-1 openstack_network_exporter[199442]: INFO    23:43:46 main.go:48: registering *coverage.Collector
Dec 02 23:43:46 compute-1 openstack_network_exporter[199442]: INFO    23:43:46 main.go:48: registering *datapath.Collector
Dec 02 23:43:46 compute-1 openstack_network_exporter[199442]: INFO    23:43:46 main.go:48: registering *iface.Collector
Dec 02 23:43:46 compute-1 openstack_network_exporter[199442]: INFO    23:43:46 main.go:48: registering *memory.Collector
Dec 02 23:43:46 compute-1 openstack_network_exporter[199442]: INFO    23:43:46 main.go:48: registering *ovnnorthd.Collector
Dec 02 23:43:46 compute-1 openstack_network_exporter[199442]: INFO    23:43:46 main.go:48: registering *ovn.Collector
Dec 02 23:43:46 compute-1 openstack_network_exporter[199442]: INFO    23:43:46 main.go:48: registering *ovsdbserver.Collector
Dec 02 23:43:46 compute-1 openstack_network_exporter[199442]: INFO    23:43:46 main.go:48: registering *pmd_perf.Collector
Dec 02 23:43:46 compute-1 openstack_network_exporter[199442]: INFO    23:43:46 main.go:48: registering *pmd_rxq.Collector
Dec 02 23:43:46 compute-1 openstack_network_exporter[199442]: INFO    23:43:46 main.go:48: registering *vswitch.Collector
Dec 02 23:43:46 compute-1 openstack_network_exporter[199442]: NOTICE  23:43:46 main.go:76: listening on https://:9105/metrics
Dec 02 23:43:46 compute-1 podman[199427]: 2025-12-02 23:43:46.2021016 +0000 UTC m=+0.230935955 container start c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Dec 02 23:43:46 compute-1 podman[199427]: openstack_network_exporter
Dec 02 23:43:46 compute-1 systemd[1]: Started openstack_network_exporter container.
Dec 02 23:43:46 compute-1 sudo[199384]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:46 compute-1 podman[199452]: 2025-12-02 23:43:46.326204068 +0000 UTC m=+0.109084245 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-type=git)
Dec 02 23:43:46 compute-1 sudo[199624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiexghtzmkyknyvfezcavmygiwjzzxtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719026.55416-1269-248606400300705/AnsiballZ_systemd.py'
Dec 02 23:43:46 compute-1 sudo[199624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:47 compute-1 python3.9[199626]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:43:47 compute-1 systemd[1]: Stopping openstack_network_exporter container...
Dec 02 23:43:47 compute-1 systemd[1]: libpod-c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f.scope: Deactivated successfully.
Dec 02 23:43:47 compute-1 podman[199630]: 2025-12-02 23:43:47.392933513 +0000 UTC m=+0.065165582 container died c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 23:43:47 compute-1 systemd[1]: c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f-20b078a07da05204.timer: Deactivated successfully.
Dec 02 23:43:47 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f.
Dec 02 23:43:47 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f-userdata-shm.mount: Deactivated successfully.
Dec 02 23:43:47 compute-1 systemd[1]: var-lib-containers-storage-overlay-89a13aee767374e4e497e5f23913f8fedeb57718ae49d9dfeaff423cba6a3f2f-merged.mount: Deactivated successfully.
Dec 02 23:43:48 compute-1 podman[199630]: 2025-12-02 23:43:48.564280291 +0000 UTC m=+1.236512370 container cleanup c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=)
Dec 02 23:43:48 compute-1 podman[199630]: openstack_network_exporter
Dec 02 23:43:48 compute-1 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 02 23:43:48 compute-1 podman[199657]: openstack_network_exporter
Dec 02 23:43:48 compute-1 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Dec 02 23:43:48 compute-1 systemd[1]: Stopped openstack_network_exporter container.
Dec 02 23:43:48 compute-1 systemd[1]: Starting openstack_network_exporter container...
Dec 02 23:43:48 compute-1 systemd[1]: Started libcrun container.
Dec 02 23:43:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89a13aee767374e4e497e5f23913f8fedeb57718ae49d9dfeaff423cba6a3f2f/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89a13aee767374e4e497e5f23913f8fedeb57718ae49d9dfeaff423cba6a3f2f/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89a13aee767374e4e497e5f23913f8fedeb57718ae49d9dfeaff423cba6a3f2f/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:48 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f.
Dec 02 23:43:48 compute-1 podman[199670]: 2025-12-02 23:43:48.889836599 +0000 UTC m=+0.177186838 container init c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., distribution-scope=public)
Dec 02 23:43:48 compute-1 openstack_network_exporter[199685]: INFO    23:43:48 main.go:48: registering *bridge.Collector
Dec 02 23:43:48 compute-1 openstack_network_exporter[199685]: INFO    23:43:48 main.go:48: registering *coverage.Collector
Dec 02 23:43:48 compute-1 openstack_network_exporter[199685]: INFO    23:43:48 main.go:48: registering *datapath.Collector
Dec 02 23:43:48 compute-1 openstack_network_exporter[199685]: INFO    23:43:48 main.go:48: registering *iface.Collector
Dec 02 23:43:48 compute-1 openstack_network_exporter[199685]: INFO    23:43:48 main.go:48: registering *memory.Collector
Dec 02 23:43:48 compute-1 openstack_network_exporter[199685]: INFO    23:43:48 main.go:48: registering *ovnnorthd.Collector
Dec 02 23:43:48 compute-1 openstack_network_exporter[199685]: INFO    23:43:48 main.go:48: registering *ovn.Collector
Dec 02 23:43:48 compute-1 openstack_network_exporter[199685]: INFO    23:43:48 main.go:48: registering *ovsdbserver.Collector
Dec 02 23:43:48 compute-1 openstack_network_exporter[199685]: INFO    23:43:48 main.go:48: registering *pmd_perf.Collector
Dec 02 23:43:48 compute-1 openstack_network_exporter[199685]: INFO    23:43:48 main.go:48: registering *pmd_rxq.Collector
Dec 02 23:43:48 compute-1 openstack_network_exporter[199685]: INFO    23:43:48 main.go:48: registering *vswitch.Collector
Dec 02 23:43:48 compute-1 openstack_network_exporter[199685]: NOTICE  23:43:48 main.go:76: listening on https://:9105/metrics
Dec 02 23:43:48 compute-1 podman[199670]: 2025-12-02 23:43:48.929241979 +0000 UTC m=+0.216592168 container start c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 23:43:48 compute-1 podman[199670]: openstack_network_exporter
Dec 02 23:43:48 compute-1 systemd[1]: Started openstack_network_exporter container.
Dec 02 23:43:48 compute-1 sudo[199624]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:49 compute-1 podman[199695]: 2025-12-02 23:43:49.068445832 +0000 UTC m=+0.121079743 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 23:43:49 compute-1 sudo[199865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpjbrvkjyeozwukewqpzgndussndfvkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719029.2458706-1285-113149873461511/AnsiballZ_find.py'
Dec 02 23:43:49 compute-1 sudo[199865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:49 compute-1 python3.9[199867]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 23:43:49 compute-1 sudo[199865]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:50 compute-1 sudo[200017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmzabrorarymuqzouizqadpljhgysmlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719030.3705535-1303-205994808620987/AnsiballZ_podman_container_info.py'
Dec 02 23:43:50 compute-1 sudo[200017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:51 compute-1 python3.9[200019]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Dec 02 23:43:51 compute-1 sudo[200017]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:52 compute-1 sudo[200182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djjcgugbqevtgxydthvxktjghnenzopi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719031.5801313-1311-181198081496528/AnsiballZ_podman_container_exec.py'
Dec 02 23:43:52 compute-1 sudo[200182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:52 compute-1 python3.9[200184]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:43:52 compute-1 systemd[1]: Started libpod-conmon-a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda.scope.
Dec 02 23:43:52 compute-1 podman[200185]: 2025-12-02 23:43:52.432696889 +0000 UTC m=+0.082511703 container exec a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Dec 02 23:43:52 compute-1 podman[200185]: 2025-12-02 23:43:52.44676936 +0000 UTC m=+0.096584174 container exec_died a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Dec 02 23:43:52 compute-1 sudo[200182]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:52 compute-1 systemd[1]: libpod-conmon-a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda.scope: Deactivated successfully.
Dec 02 23:43:53 compute-1 sudo[200367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uquxzvqmojazlltwjirndlpfacjsqmwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719032.7296236-1319-18246724955712/AnsiballZ_podman_container_exec.py'
Dec 02 23:43:53 compute-1 sudo[200367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:53 compute-1 python3.9[200369]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:43:53 compute-1 systemd[1]: Started libpod-conmon-a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda.scope.
Dec 02 23:43:53 compute-1 podman[200370]: 2025-12-02 23:43:53.405828916 +0000 UTC m=+0.094656095 container exec a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 23:43:53 compute-1 podman[200370]: 2025-12-02 23:43:53.440098839 +0000 UTC m=+0.128926018 container exec_died a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 23:43:53 compute-1 sudo[200367]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:53 compute-1 systemd[1]: libpod-conmon-a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda.scope: Deactivated successfully.
Dec 02 23:43:53 compute-1 podman[200387]: 2025-12-02 23:43:53.571291972 +0000 UTC m=+0.159321174 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 02 23:43:54 compute-1 sudo[200571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udciufuskueiwbdknryjpojepyvsnvwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719033.7621212-1327-213516689169983/AnsiballZ_file.py'
Dec 02 23:43:54 compute-1 sudo[200571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:54 compute-1 python3.9[200573]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:54 compute-1 sudo[200571]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:55 compute-1 sudo[200723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyohtmycxrarnqjuogvkzzzxyghkhoyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719034.7138424-1336-9402740511993/AnsiballZ_podman_container_info.py'
Dec 02 23:43:55 compute-1 sudo[200723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:55 compute-1 python3.9[200725]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Dec 02 23:43:55 compute-1 sudo[200723]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:56 compute-1 sudo[200888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkcwvdnhyycmxarwngslfshsjklrhdlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719035.686802-1344-125073558818814/AnsiballZ_podman_container_exec.py'
Dec 02 23:43:56 compute-1 sudo[200888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:56 compute-1 python3.9[200890]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:43:56 compute-1 systemd[1]: Started libpod-conmon-7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633.scope.
Dec 02 23:43:56 compute-1 podman[200891]: 2025-12-02 23:43:56.440357851 +0000 UTC m=+0.091270161 container exec 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 02 23:43:56 compute-1 podman[200891]: 2025-12-02 23:43:56.450923374 +0000 UTC m=+0.101835704 container exec_died 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 02 23:43:56 compute-1 sudo[200888]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:56 compute-1 systemd[1]: libpod-conmon-7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633.scope: Deactivated successfully.
Dec 02 23:43:57 compute-1 sudo[201072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaovppkuvnhndebkxzsqnocusmaxcoiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719036.7473483-1352-40970105840414/AnsiballZ_podman_container_exec.py'
Dec 02 23:43:57 compute-1 sudo[201072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:57 compute-1 python3.9[201074]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:43:57 compute-1 systemd[1]: Started libpod-conmon-7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633.scope.
Dec 02 23:43:57 compute-1 podman[201075]: 2025-12-02 23:43:57.455855032 +0000 UTC m=+0.123515023 container exec 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 02 23:43:57 compute-1 podman[201075]: 2025-12-02 23:43:57.46218033 +0000 UTC m=+0.129840331 container exec_died 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Dec 02 23:43:57 compute-1 sudo[201072]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:57 compute-1 systemd[1]: libpod-conmon-7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633.scope: Deactivated successfully.
Dec 02 23:43:58 compute-1 sudo[201258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poktiuetiulnvqxjjfsvohruakiszloy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719037.7290297-1360-32757991965444/AnsiballZ_file.py'
Dec 02 23:43:58 compute-1 sudo[201258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:58 compute-1 python3.9[201260]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:58 compute-1 sudo[201258]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:58 compute-1 sshd-session[201107]: Received disconnect from 193.46.255.33 port 26070:11:  [preauth]
Dec 02 23:43:58 compute-1 sshd-session[201107]: Disconnected from authenticating user root 193.46.255.33 port 26070 [preauth]
Dec 02 23:43:59 compute-1 sudo[201412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvoyipdbyangovmkhmfuiswajojbllbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719038.7680948-1369-192740612771902/AnsiballZ_podman_container_info.py'
Dec 02 23:43:59 compute-1 sudo[201412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:59 compute-1 python3.9[201414]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Dec 02 23:43:59 compute-1 sudo[201412]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:59 compute-1 sshd-session[201367]: Invalid user ubuntu from 193.32.162.146 port 58590
Dec 02 23:43:59 compute-1 sshd-session[201367]: Connection closed by invalid user ubuntu 193.32.162.146 port 58590 [preauth]
Dec 02 23:44:00 compute-1 sudo[201578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnzjkyfncvhhejeuqcybholcnzsslnqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719039.7377763-1377-15935001856187/AnsiballZ_podman_container_exec.py'
Dec 02 23:44:00 compute-1 sudo[201578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:00 compute-1 python3.9[201580]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:44:00 compute-1 systemd[1]: Started libpod-conmon-135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7.scope.
Dec 02 23:44:00 compute-1 podman[201581]: 2025-12-02 23:44:00.495008212 +0000 UTC m=+0.102448999 container exec 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Dec 02 23:44:00 compute-1 podman[201581]: 2025-12-02 23:44:00.530925925 +0000 UTC m=+0.138366712 container exec_died 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:44:00 compute-1 systemd[1]: libpod-conmon-135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7.scope: Deactivated successfully.
Dec 02 23:44:00 compute-1 sudo[201578]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:01 compute-1 sudo[201761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiggnvjmicwgjlhlbhscyvjphpxustht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719040.8261123-1385-247253902276683/AnsiballZ_podman_container_exec.py'
Dec 02 23:44:01 compute-1 sudo[201761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:01 compute-1 auditd[703]: Audit daemon rotating log files
Dec 02 23:44:01 compute-1 python3.9[201763]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:44:01 compute-1 systemd[1]: Started libpod-conmon-135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7.scope.
Dec 02 23:44:01 compute-1 podman[201764]: 2025-12-02 23:44:01.602263535 +0000 UTC m=+0.105753131 container exec 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 23:44:01 compute-1 podman[201764]: 2025-12-02 23:44:01.637865441 +0000 UTC m=+0.141355047 container exec_died 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202)
Dec 02 23:44:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:44:01.675 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:44:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:44:01.677 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:44:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:44:01.678 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:44:01 compute-1 systemd[1]: libpod-conmon-135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7.scope: Deactivated successfully.
Dec 02 23:44:01 compute-1 sudo[201761]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:02 compute-1 sudo[201947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-getwwlkmfedjbgccrtunfnwnsurgnqkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719041.9461377-1393-128820571593040/AnsiballZ_file.py'
Dec 02 23:44:02 compute-1 sudo[201947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:02 compute-1 python3.9[201949]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:02 compute-1 sudo[201947]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:03 compute-1 sudo[202099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqmiiltqmyavmtjvngdikypeyfkmfayk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719042.80446-1402-220167117852331/AnsiballZ_podman_container_info.py'
Dec 02 23:44:03 compute-1 sudo[202099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:03 compute-1 python3.9[202101]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Dec 02 23:44:03 compute-1 sudo[202099]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:04 compute-1 podman[202220]: 2025-12-02 23:44:04.277990086 +0000 UTC m=+0.103845475 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 23:44:04 compute-1 sudo[202283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikdtopxjhugmdrjhplcakldcuywtugfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719043.871302-1410-69646444505296/AnsiballZ_podman_container_exec.py'
Dec 02 23:44:04 compute-1 sudo[202283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:04 compute-1 python3.9[202288]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:44:04 compute-1 systemd[1]: Started libpod-conmon-ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28.scope.
Dec 02 23:44:04 compute-1 podman[202289]: 2025-12-02 23:44:04.659658209 +0000 UTC m=+0.116779005 container exec ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:44:04 compute-1 podman[202289]: 2025-12-02 23:44:04.696144997 +0000 UTC m=+0.153265793 container exec_died ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:44:04 compute-1 systemd[1]: libpod-conmon-ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28.scope: Deactivated successfully.
Dec 02 23:44:04 compute-1 sudo[202283]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:05 compute-1 sudo[202471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzmdhwhoqwuzxlbsjlefeknervsfkrys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719045.027122-1418-45493569856928/AnsiballZ_podman_container_exec.py'
Dec 02 23:44:05 compute-1 sudo[202471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:05 compute-1 python3.9[202473]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:44:05 compute-1 systemd[1]: Started libpod-conmon-ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28.scope.
Dec 02 23:44:05 compute-1 podman[202474]: 2025-12-02 23:44:05.775849575 +0000 UTC m=+0.103742502 container exec ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 23:44:05 compute-1 podman[202474]: 2025-12-02 23:44:05.811162714 +0000 UTC m=+0.139055591 container exec_died ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:44:05 compute-1 systemd[1]: libpod-conmon-ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28.scope: Deactivated successfully.
Dec 02 23:44:05 compute-1 sudo[202471]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:06 compute-1 sudo[202667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lawskxzchgkpwqiiaftwewvjemtwvurf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719046.1343546-1426-192929915616582/AnsiballZ_file.py'
Dec 02 23:44:06 compute-1 sudo[202667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:06 compute-1 podman[202628]: 2025-12-02 23:44:06.625849509 +0000 UTC m=+0.146752091 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 23:44:06 compute-1 python3.9[202675]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:06 compute-1 sudo[202667]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:07 compute-1 sudo[202833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnsdqnqjeujiyqkazzvgeudrtrabxuvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719047.0452633-1435-76306388450723/AnsiballZ_podman_container_info.py'
Dec 02 23:44:07 compute-1 sudo[202833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:07 compute-1 python3.9[202835]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Dec 02 23:44:07 compute-1 sudo[202833]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:08 compute-1 sudo[202998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuywyaqahqhwrtwmrfppszwdpkfbhpxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719047.9955883-1443-164491819076/AnsiballZ_podman_container_exec.py'
Dec 02 23:44:08 compute-1 sudo[202998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:08 compute-1 python3.9[203000]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:44:08 compute-1 systemd[1]: Started libpod-conmon-c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f.scope.
Dec 02 23:44:08 compute-1 podman[203001]: 2025-12-02 23:44:08.732909653 +0000 UTC m=+0.105152097 container exec c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc.)
Dec 02 23:44:08 compute-1 podman[203001]: 2025-12-02 23:44:08.746980963 +0000 UTC m=+0.119223357 container exec_died c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 23:44:08 compute-1 sudo[202998]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:08 compute-1 systemd[1]: libpod-conmon-c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f.scope: Deactivated successfully.
Dec 02 23:44:09 compute-1 sudo[203182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohqjcnuiaehgurmrhcpbiiztcyogeojz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719049.0407658-1451-91338090530547/AnsiballZ_podman_container_exec.py'
Dec 02 23:44:09 compute-1 sudo[203182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:09 compute-1 python3.9[203184]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:44:09 compute-1 systemd[1]: Started libpod-conmon-c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f.scope.
Dec 02 23:44:09 compute-1 podman[203185]: 2025-12-02 23:44:09.767885528 +0000 UTC m=+0.095885475 container exec c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, release=1755695350, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, version=9.6)
Dec 02 23:44:09 compute-1 podman[203185]: 2025-12-02 23:44:09.803954046 +0000 UTC m=+0.131953993 container exec_died c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, release=1755695350, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Dec 02 23:44:09 compute-1 systemd[1]: libpod-conmon-c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f.scope: Deactivated successfully.
Dec 02 23:44:09 compute-1 sudo[203182]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:10 compute-1 sudo[203378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgieezupobmljnrtcdjdheqockavkeqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719050.0661047-1459-107286258132524/AnsiballZ_file.py'
Dec 02 23:44:10 compute-1 sudo[203378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:10 compute-1 podman[203337]: 2025-12-02 23:44:10.510359489 +0000 UTC m=+0.068068625 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 02 23:44:10 compute-1 python3.9[203384]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:10 compute-1 sudo[203378]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:19 compute-1 podman[203409]: 2025-12-02 23:44:19.260970865 +0000 UTC m=+0.093226001 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350)
Dec 02 23:44:20 compute-1 nova_compute[187157]: 2025-12-02 23:44:20.615 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:20 compute-1 nova_compute[187157]: 2025-12-02 23:44:20.615 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.143 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.143 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.143 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.144 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.144 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.144 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.144 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.145 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.711 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.712 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.712 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.712 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.898 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.899 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.924 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.925 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6057MB free_disk=73.20050048828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.925 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:44:21 compute-1 nova_compute[187157]: 2025-12-02 23:44:21.926 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:44:22 compute-1 nova_compute[187157]: 2025-12-02 23:44:22.990 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:44:22 compute-1 nova_compute[187157]: 2025-12-02 23:44:22.991 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:44:21 up 51 min,  0 user,  load average: 0.60, 0.73, 0.63\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:44:23 compute-1 nova_compute[187157]: 2025-12-02 23:44:23.020 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:44:23 compute-1 nova_compute[187157]: 2025-12-02 23:44:23.532 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:44:24 compute-1 nova_compute[187157]: 2025-12-02 23:44:24.043 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:44:24 compute-1 nova_compute[187157]: 2025-12-02 23:44:24.044 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.118s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:44:24 compute-1 podman[203432]: 2025-12-02 23:44:24.268513932 +0000 UTC m=+0.106963422 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4)
Dec 02 23:44:29 compute-1 sudo[203577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txperjiokzkngkvfaswrxxvbfgdjlsgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719069.3801951-1635-18239731919602/AnsiballZ_file.py'
Dec 02 23:44:29 compute-1 sudo[203577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:30 compute-1 python3.9[203579]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:30 compute-1 sudo[203577]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:30 compute-1 sudo[203729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltcmhvmfslzrpwuazdziwytmntxuuvil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719070.2560813-1651-13281570482599/AnsiballZ_stat.py'
Dec 02 23:44:30 compute-1 sudo[203729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:30 compute-1 python3.9[203731]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:30 compute-1 sudo[203729]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:31 compute-1 sudo[203852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxrvzjytxukeuqspsejdwdaqaqfyxiwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719070.2560813-1651-13281570482599/AnsiballZ_copy.py'
Dec 02 23:44:31 compute-1 sudo[203852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:31 compute-1 python3.9[203854]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764719070.2560813-1651-13281570482599/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:31 compute-1 sudo[203852]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:32 compute-1 sudo[204004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqjnxbeszejaxqfdypteevfgirfxxury ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719072.1330109-1683-98870704278347/AnsiballZ_file.py'
Dec 02 23:44:33 compute-1 sudo[204004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:33 compute-1 python3.9[204006]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:33 compute-1 sudo[204004]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:33 compute-1 sudo[204156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcqlyendsqfjqahbhrgsrddhfkqcolne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719073.5438464-1699-178081836491422/AnsiballZ_stat.py'
Dec 02 23:44:33 compute-1 sudo[204156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:34 compute-1 python3.9[204158]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:34 compute-1 sudo[204156]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:34 compute-1 podman[204208]: 2025-12-02 23:44:34.500677576 +0000 UTC m=+0.062243723 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:44:34 compute-1 sudo[204252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gosxuyfhnvejduyjlaopcwsaogfcebcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719073.5438464-1699-178081836491422/AnsiballZ_file.py'
Dec 02 23:44:34 compute-1 sudo[204252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:34 compute-1 python3.9[204261]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:34 compute-1 sudo[204252]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:35 compute-1 sudo[204411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bezekenmupmbmpzbkashzwdcielhoadw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719074.965776-1723-150131881115665/AnsiballZ_stat.py'
Dec 02 23:44:35 compute-1 sudo[204411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:35 compute-1 python3.9[204413]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:35 compute-1 sudo[204411]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:35 compute-1 sudo[204489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-josytxhjaubzpqkbctpmdyugykcgcgkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719074.965776-1723-150131881115665/AnsiballZ_file.py'
Dec 02 23:44:35 compute-1 sudo[204489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:36 compute-1 python3.9[204491]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.97aqpwtc recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:36 compute-1 sudo[204489]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:36 compute-1 sudo[204656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrxgzgzmyxpiyvgimgnzsqtebgzqofrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719076.4500303-1747-199056091366707/AnsiballZ_stat.py'
Dec 02 23:44:36 compute-1 sudo[204656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:36 compute-1 podman[204615]: 2025-12-02 23:44:36.968963648 +0000 UTC m=+0.131378282 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 02 23:44:37 compute-1 python3.9[204663]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:37 compute-1 sudo[204656]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:37 compute-1 sudo[204745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hphhkfxkipucgzqvzcgkbpspqpaatogp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719076.4500303-1747-199056091366707/AnsiballZ_file.py'
Dec 02 23:44:37 compute-1 sudo[204745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:37 compute-1 python3.9[204747]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:37 compute-1 sudo[204745]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:38 compute-1 sudo[204897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvbygdpvjkvovmdbstgzpjmycfszdawm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719078.0400798-1773-163289704548586/AnsiballZ_command.py'
Dec 02 23:44:38 compute-1 sudo[204897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:38 compute-1 python3.9[204899]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:44:38 compute-1 sudo[204897]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:39 compute-1 sudo[205050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubkmuzqztpplewzvhivgmdsevzbldyic ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764719078.9685078-1789-199210606463521/AnsiballZ_edpm_nftables_from_files.py'
Dec 02 23:44:39 compute-1 sudo[205050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:39 compute-1 python3[205052]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 02 23:44:39 compute-1 sudo[205050]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:40 compute-1 sudo[205202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myckdaglpoyplsgdwvsamztftydtjpfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719079.9387171-1805-176891578245246/AnsiballZ_stat.py'
Dec 02 23:44:40 compute-1 sudo[205202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:40 compute-1 python3.9[205204]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:40 compute-1 sudo[205202]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:40 compute-1 podman[205254]: 2025-12-02 23:44:40.922115667 +0000 UTC m=+0.074300324 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 02 23:44:40 compute-1 sudo[205295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlkmaznwgcvmkizmrkoimskmgdliytfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719079.9387171-1805-176891578245246/AnsiballZ_file.py'
Dec 02 23:44:40 compute-1 sudo[205295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:41 compute-1 python3.9[205303]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:41 compute-1 sudo[205295]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:41 compute-1 sudo[205453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsvdavhapyqcefuuwmtvpwnlipedggmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719081.4245827-1830-29533806918658/AnsiballZ_stat.py'
Dec 02 23:44:41 compute-1 sudo[205453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:42 compute-1 python3.9[205455]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:42 compute-1 sudo[205453]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:42 compute-1 sudo[205531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xminrriqopevjhajdvhjqvzwoodulxad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719081.4245827-1830-29533806918658/AnsiballZ_file.py'
Dec 02 23:44:42 compute-1 sudo[205531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:42 compute-1 python3.9[205533]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:42 compute-1 sudo[205531]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:43 compute-1 sudo[205683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xevjmcjxfkczbyswiwmvnycnkessaqsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719082.8670716-1853-84651708585602/AnsiballZ_stat.py'
Dec 02 23:44:43 compute-1 sudo[205683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:43 compute-1 python3.9[205685]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:43 compute-1 sudo[205683]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:43 compute-1 sudo[205761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exhwtmdnryjsflmbwhzuaxowiptxvctj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719082.8670716-1853-84651708585602/AnsiballZ_file.py'
Dec 02 23:44:43 compute-1 sudo[205761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:43 compute-1 python3.9[205763]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:44 compute-1 sudo[205761]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:44 compute-1 sudo[205913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbfwinnfhifizqxsqjcaqtyzxkmlgikg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719084.2659547-1877-68714347365149/AnsiballZ_stat.py'
Dec 02 23:44:44 compute-1 sudo[205913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:44 compute-1 python3.9[205915]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:44 compute-1 sudo[205913]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:45 compute-1 sudo[205991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqjjuchfrcocixaudwlsiyyeseazgxqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719084.2659547-1877-68714347365149/AnsiballZ_file.py'
Dec 02 23:44:45 compute-1 sudo[205991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:45 compute-1 python3.9[205993]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:45 compute-1 sudo[205991]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:46 compute-1 sudo[206143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfifharhlbkolzqooetxuegxdvjovgou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719085.7728024-1901-252177287132019/AnsiballZ_stat.py'
Dec 02 23:44:46 compute-1 sudo[206143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:46 compute-1 python3.9[206145]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:46 compute-1 sudo[206143]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:46 compute-1 sudo[206268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxspcrnvzwerjpejaqdvhcvxtikiqgps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719085.7728024-1901-252177287132019/AnsiballZ_copy.py'
Dec 02 23:44:46 compute-1 sudo[206268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:47 compute-1 python3.9[206270]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764719085.7728024-1901-252177287132019/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:47 compute-1 sudo[206268]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:47 compute-1 sudo[206420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dirfrfjkfpwddzauxsjrkjtxwmwwqjqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719087.364935-1931-44229771072223/AnsiballZ_file.py'
Dec 02 23:44:47 compute-1 sudo[206420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:47 compute-1 python3.9[206422]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:47 compute-1 sudo[206420]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:48 compute-1 sudo[206572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbsqewvbmzgilbceczcljxniylsogsxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719088.1804225-1947-185292892611125/AnsiballZ_command.py'
Dec 02 23:44:48 compute-1 sudo[206572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:48 compute-1 python3.9[206574]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:44:48 compute-1 sudo[206572]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:49 compute-1 sudo[206740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjgcnkvyfpcoaxwebcxbgympssnzdcay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719089.0438046-1963-131864502279611/AnsiballZ_blockinfile.py'
Dec 02 23:44:49 compute-1 sudo[206740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:49 compute-1 podman[206701]: 2025-12-02 23:44:49.547207904 +0000 UTC m=+0.083188938 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=edpm, distribution-scope=public, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Dec 02 23:44:49 compute-1 python3.9[206747]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:49 compute-1 sudo[206740]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:50 compute-1 sudo[206900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agayeeyhfxxsmlqxerypdhdlkimqtffd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719090.097255-1981-270419071166487/AnsiballZ_command.py'
Dec 02 23:44:50 compute-1 sudo[206900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:50 compute-1 python3.9[206902]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:44:50 compute-1 sudo[206900]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:51 compute-1 sudo[207053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjaqldssckpyqfslzmnwnyawlbwvleqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719091.0770156-1997-52185844212784/AnsiballZ_stat.py'
Dec 02 23:44:51 compute-1 sudo[207053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:51 compute-1 python3.9[207055]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:44:51 compute-1 sudo[207053]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:52 compute-1 sudo[207207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjjhejqhcgpwfhcoifyasqkwwgxmfuru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719091.9646366-2013-63508516836367/AnsiballZ_command.py'
Dec 02 23:44:52 compute-1 sudo[207207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:52 compute-1 python3.9[207209]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:44:52 compute-1 sudo[207207]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:53 compute-1 sudo[207362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvtvhsxmstuksakgfoclsgkvulqhyfax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719092.8401806-2029-213829639685485/AnsiballZ_file.py'
Dec 02 23:44:53 compute-1 sudo[207362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:53 compute-1 openstack_network_exporter[199685]: ERROR   23:44:53 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:44:53 compute-1 openstack_network_exporter[199685]: ERROR   23:44:53 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:44:53 compute-1 openstack_network_exporter[199685]: ERROR   23:44:53 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:44:53 compute-1 openstack_network_exporter[199685]: ERROR   23:44:53 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:44:53 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:44:53 compute-1 openstack_network_exporter[199685]: ERROR   23:44:53 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:44:53 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:44:53 compute-1 python3.9[207364]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:53 compute-1 sudo[207362]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:53 compute-1 sshd-session[187500]: Connection closed by 192.168.122.30 port 42014
Dec 02 23:44:53 compute-1 sshd-session[187497]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:44:53 compute-1 systemd[1]: session-26.scope: Deactivated successfully.
Dec 02 23:44:53 compute-1 systemd[1]: session-26.scope: Consumed 1min 41.260s CPU time.
Dec 02 23:44:53 compute-1 systemd-logind[790]: Session 26 logged out. Waiting for processes to exit.
Dec 02 23:44:53 compute-1 systemd-logind[790]: Removed session 26.
Dec 02 23:44:55 compute-1 podman[207394]: 2025-12-02 23:44:55.251074096 +0000 UTC m=+0.089556893 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 02 23:45:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:45:01.679 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:45:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:45:01.679 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:45:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:45:01.679 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:45:05 compute-1 podman[207416]: 2025-12-02 23:45:05.248739559 +0000 UTC m=+0.078443654 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:45:05 compute-1 podman[197537]: time="2025-12-02T23:45:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:45:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:45:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:45:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:45:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2564 "" "Go-http-client/1.1"
Dec 02 23:45:07 compute-1 podman[207445]: 2025-12-02 23:45:07.285088136 +0000 UTC m=+0.125763417 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 02 23:45:11 compute-1 podman[207473]: 2025-12-02 23:45:11.253441493 +0000 UTC m=+0.080593926 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:45:19 compute-1 openstack_network_exporter[199685]: ERROR   23:45:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:45:19 compute-1 openstack_network_exporter[199685]: ERROR   23:45:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:45:19 compute-1 openstack_network_exporter[199685]: ERROR   23:45:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:45:19 compute-1 openstack_network_exporter[199685]: ERROR   23:45:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:45:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:45:19 compute-1 openstack_network_exporter[199685]: ERROR   23:45:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:45:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:45:20 compute-1 podman[207493]: 2025-12-02 23:45:20.234664603 +0000 UTC m=+0.070868271 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.045 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.045 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.045 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.046 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.046 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.046 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.046 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.046 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.047 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.568 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.569 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.569 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.569 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.743 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.744 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.759 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.759 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6119MB free_disk=73.20013046264648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.760 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:45:24 compute-1 nova_compute[187157]: 2025-12-02 23:45:24.760 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:45:25 compute-1 nova_compute[187157]: 2025-12-02 23:45:25.814 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:45:25 compute-1 nova_compute[187157]: 2025-12-02 23:45:25.814 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:45:24 up 52 min,  0 user,  load average: 0.38, 0.66, 0.61\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:45:25 compute-1 nova_compute[187157]: 2025-12-02 23:45:25.994 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:45:26 compute-1 podman[207515]: 2025-12-02 23:45:26.250333239 +0000 UTC m=+0.083011192 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 23:45:26 compute-1 nova_compute[187157]: 2025-12-02 23:45:26.509 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:45:27 compute-1 nova_compute[187157]: 2025-12-02 23:45:27.021 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:45:27 compute-1 nova_compute[187157]: 2025-12-02 23:45:27.021 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.261s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:45:35 compute-1 podman[197537]: time="2025-12-02T23:45:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:45:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:45:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:45:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:45:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2569 "" "Go-http-client/1.1"
Dec 02 23:45:35 compute-1 podman[207536]: 2025-12-02 23:45:35.801065575 +0000 UTC m=+0.108388328 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:45:38 compute-1 podman[207561]: 2025-12-02 23:45:38.302050708 +0000 UTC m=+0.135310489 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 02 23:45:42 compute-1 podman[207588]: 2025-12-02 23:45:42.22238126 +0000 UTC m=+0.067275511 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 02 23:45:49 compute-1 openstack_network_exporter[199685]: ERROR   23:45:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:45:49 compute-1 openstack_network_exporter[199685]: ERROR   23:45:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:45:49 compute-1 openstack_network_exporter[199685]: ERROR   23:45:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:45:49 compute-1 openstack_network_exporter[199685]: ERROR   23:45:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:45:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:45:49 compute-1 openstack_network_exporter[199685]: ERROR   23:45:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:45:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:45:51 compute-1 podman[207607]: 2025-12-02 23:45:51.270913235 +0000 UTC m=+0.099650685 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 23:45:57 compute-1 podman[207629]: 2025-12-02 23:45:57.270249656 +0000 UTC m=+0.108051789 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:46:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:46:01.681 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:46:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:46:01.682 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:46:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:46:01.682 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:46:05 compute-1 podman[197537]: time="2025-12-02T23:46:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:46:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:46:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:46:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:46:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2567 "" "Go-http-client/1.1"
Dec 02 23:46:06 compute-1 podman[207651]: 2025-12-02 23:46:06.265659247 +0000 UTC m=+0.087831120 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:46:09 compute-1 podman[207676]: 2025-12-02 23:46:09.314948535 +0000 UTC m=+0.149855842 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller)
Dec 02 23:46:13 compute-1 podman[207702]: 2025-12-02 23:46:13.248344952 +0000 UTC m=+0.085011511 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 23:46:19 compute-1 openstack_network_exporter[199685]: ERROR   23:46:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:46:19 compute-1 openstack_network_exporter[199685]: ERROR   23:46:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:46:19 compute-1 openstack_network_exporter[199685]: ERROR   23:46:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:46:19 compute-1 openstack_network_exporter[199685]: ERROR   23:46:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:46:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:46:19 compute-1 openstack_network_exporter[199685]: ERROR   23:46:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:46:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:46:22 compute-1 podman[207722]: 2025-12-02 23:46:22.243889785 +0000 UTC m=+0.082992682 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 23:46:24 compute-1 nova_compute[187157]: 2025-12-02 23:46:24.672 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:24 compute-1 nova_compute[187157]: 2025-12-02 23:46:24.673 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.183 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.183 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.184 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.184 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.184 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.184 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.185 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.185 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.703 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.704 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.704 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.704 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.908 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.910 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.923 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.924 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6168MB free_disk=73.20407104492188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.924 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:46:25 compute-1 nova_compute[187157]: 2025-12-02 23:46:25.925 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:46:26 compute-1 nova_compute[187157]: 2025-12-02 23:46:26.980 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:46:26 compute-1 nova_compute[187157]: 2025-12-02 23:46:26.981 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:46:25 up 53 min,  0 user,  load average: 0.14, 0.53, 0.57\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:46:27 compute-1 nova_compute[187157]: 2025-12-02 23:46:27.006 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:46:27 compute-1 nova_compute[187157]: 2025-12-02 23:46:27.514 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:46:28 compute-1 nova_compute[187157]: 2025-12-02 23:46:28.026 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:46:28 compute-1 nova_compute[187157]: 2025-12-02 23:46:28.027 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:46:28 compute-1 podman[207745]: 2025-12-02 23:46:28.252919047 +0000 UTC m=+0.086904585 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 23:46:37 compute-1 podman[207765]: 2025-12-02 23:46:37.25729155 +0000 UTC m=+0.068135995 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 23:46:40 compute-1 podman[207790]: 2025-12-02 23:46:40.312361293 +0000 UTC m=+0.143678685 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:46:44 compute-1 podman[207816]: 2025-12-02 23:46:44.249973812 +0000 UTC m=+0.083646168 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 02 23:46:53 compute-1 podman[207835]: 2025-12-02 23:46:53.224837317 +0000 UTC m=+0.061748122 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350)
Dec 02 23:46:59 compute-1 podman[207857]: 2025-12-02 23:46:59.224561155 +0000 UTC m=+0.064059977 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true)
Dec 02 23:47:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:01.683 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:47:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:01.683 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:47:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:01.683 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:47:05 compute-1 podman[197537]: time="2025-12-02T23:47:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:47:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:47:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:47:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:47:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2567 "" "Go-http-client/1.1"
Dec 02 23:47:07 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:07.448 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:47:07 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:07.450 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:47:07 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:07.454 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:47:08 compute-1 podman[207881]: 2025-12-02 23:47:08.252268688 +0000 UTC m=+0.080931302 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 23:47:11 compute-1 podman[207905]: 2025-12-02 23:47:11.322384753 +0000 UTC m=+0.158864590 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202)
Dec 02 23:47:14 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:14.911 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:09:67 192.168.122.171'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.171/24', 'neutron:device_id': 'ovnmeta-17f7ebac-3ad4-4ff9-8a59-57891d829652', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17f7ebac-3ad4-4ff9-8a59-57891d829652', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '22106c97f2524355a0bbadb98eaf5c22', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13808bcd-e156-4466-92c4-34dec905a236, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=592edbb1-0110-4221-9f00-b76b4034be4b) old=Port_Binding(mac=['fa:16:3e:0d:09:67'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-17f7ebac-3ad4-4ff9-8a59-57891d829652', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17f7ebac-3ad4-4ff9-8a59-57891d829652', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '22106c97f2524355a0bbadb98eaf5c22', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:47:14 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:14.913 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 592edbb1-0110-4221-9f00-b76b4034be4b in datapath 17f7ebac-3ad4-4ff9-8a59-57891d829652 updated
Dec 02 23:47:14 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:14.914 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 17f7ebac-3ad4-4ff9-8a59-57891d829652, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:47:14 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:14.915 104348 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpcranwgvp/privsep.sock']
Dec 02 23:47:15 compute-1 podman[207936]: 2025-12-02 23:47:15.255122054 +0000 UTC m=+0.086788562 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 02 23:47:15 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:15.744 104348 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 02 23:47:15 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:15.744 104348 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpcranwgvp/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Dec 02 23:47:15 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:15.583 207957 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 23:47:15 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:15.590 207957 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 23:47:15 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:15.593 207957 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Dec 02 23:47:15 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:15.593 207957 INFO oslo.privsep.daemon [-] privsep daemon running as pid 207957
Dec 02 23:47:15 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:15.747 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[bf17b54c-cdf2-4aeb-8829-a58ae73fe2bf]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:47:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:16.203 207957 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:47:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:16.203 207957 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:47:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:16.203 207957 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:47:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:16.611 207957 INFO oslo_service.backend [-] Loading backend: eventlet
Dec 02 23:47:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:16.616 207957 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Dec 02 23:47:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:47:16.650 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[cbe811bf-9bd7-4947-94e6-f1190fb851d8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:47:17 compute-1 nova_compute[187157]: 2025-12-02 23:47:17.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:17 compute-1 nova_compute[187157]: 2025-12-02 23:47:17.701 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 02 23:47:18 compute-1 nova_compute[187157]: 2025-12-02 23:47:18.208 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 02 23:47:18 compute-1 nova_compute[187157]: 2025-12-02 23:47:18.209 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:18 compute-1 nova_compute[187157]: 2025-12-02 23:47:18.209 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 02 23:47:18 compute-1 nova_compute[187157]: 2025-12-02 23:47:18.716 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:19 compute-1 openstack_network_exporter[199685]: ERROR   23:47:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:47:19 compute-1 openstack_network_exporter[199685]: ERROR   23:47:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:47:19 compute-1 openstack_network_exporter[199685]: ERROR   23:47:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:47:19 compute-1 openstack_network_exporter[199685]: ERROR   23:47:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:47:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:47:19 compute-1 openstack_network_exporter[199685]: ERROR   23:47:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:47:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:47:23 compute-1 nova_compute[187157]: 2025-12-02 23:47:23.219 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:23 compute-1 nova_compute[187157]: 2025-12-02 23:47:23.220 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:23 compute-1 nova_compute[187157]: 2025-12-02 23:47:23.220 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:23 compute-1 nova_compute[187157]: 2025-12-02 23:47:23.220 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:23 compute-1 nova_compute[187157]: 2025-12-02 23:47:23.221 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:23 compute-1 nova_compute[187157]: 2025-12-02 23:47:23.221 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:23 compute-1 nova_compute[187157]: 2025-12-02 23:47:23.221 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:47:23 compute-1 nova_compute[187157]: 2025-12-02 23:47:23.222 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:23 compute-1 nova_compute[187157]: 2025-12-02 23:47:23.744 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:47:23 compute-1 nova_compute[187157]: 2025-12-02 23:47:23.747 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:47:23 compute-1 nova_compute[187157]: 2025-12-02 23:47:23.747 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:47:23 compute-1 nova_compute[187157]: 2025-12-02 23:47:23.748 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:47:24 compute-1 nova_compute[187157]: 2025-12-02 23:47:24.005 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:47:24 compute-1 nova_compute[187157]: 2025-12-02 23:47:24.007 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:47:24 compute-1 nova_compute[187157]: 2025-12-02 23:47:24.034 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:47:24 compute-1 nova_compute[187157]: 2025-12-02 23:47:24.035 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6057MB free_disk=73.20407104492188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:47:24 compute-1 nova_compute[187157]: 2025-12-02 23:47:24.036 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:47:24 compute-1 nova_compute[187157]: 2025-12-02 23:47:24.037 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:47:24 compute-1 podman[207963]: 2025-12-02 23:47:24.260819029 +0000 UTC m=+0.092593081 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vcs-type=git, container_name=openstack_network_exporter)
Dec 02 23:47:25 compute-1 nova_compute[187157]: 2025-12-02 23:47:25.089 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:47:25 compute-1 nova_compute[187157]: 2025-12-02 23:47:25.091 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:47:24 up 54 min,  0 user,  load average: 0.12, 0.45, 0.54\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:47:25 compute-1 nova_compute[187157]: 2025-12-02 23:47:25.127 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:47:25 compute-1 nova_compute[187157]: 2025-12-02 23:47:25.636 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:47:26 compute-1 nova_compute[187157]: 2025-12-02 23:47:26.147 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:47:26 compute-1 nova_compute[187157]: 2025-12-02 23:47:26.148 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.111s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:47:28 compute-1 nova_compute[187157]: 2025-12-02 23:47:28.628 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:30 compute-1 podman[207984]: 2025-12-02 23:47:30.247214529 +0000 UTC m=+0.080026911 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 23:47:34 compute-1 sshd-session[208006]: Invalid user validator from 193.32.162.146 port 45858
Dec 02 23:47:34 compute-1 sshd-session[208006]: Connection closed by invalid user validator 193.32.162.146 port 45858 [preauth]
Dec 02 23:47:35 compute-1 podman[197537]: time="2025-12-02T23:47:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:47:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:47:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:47:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:47:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2569 "" "Go-http-client/1.1"
Dec 02 23:47:39 compute-1 podman[208008]: 2025-12-02 23:47:39.250720809 +0000 UTC m=+0.089599511 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 23:47:42 compute-1 podman[208034]: 2025-12-02 23:47:42.319781441 +0000 UTC m=+0.147790838 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=watcher_latest)
Dec 02 23:47:46 compute-1 podman[208060]: 2025-12-02 23:47:46.235145239 +0000 UTC m=+0.068232165 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 23:47:49 compute-1 openstack_network_exporter[199685]: ERROR   23:47:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:47:49 compute-1 openstack_network_exporter[199685]: ERROR   23:47:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:47:49 compute-1 openstack_network_exporter[199685]: ERROR   23:47:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:47:49 compute-1 openstack_network_exporter[199685]: ERROR   23:47:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:47:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:47:49 compute-1 openstack_network_exporter[199685]: ERROR   23:47:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:47:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:47:55 compute-1 podman[208083]: 2025-12-02 23:47:55.257611412 +0000 UTC m=+0.092749636 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Dec 02 23:48:01 compute-1 podman[208105]: 2025-12-02 23:48:01.265962854 +0000 UTC m=+0.096745370 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 02 23:48:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:48:01.685 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:48:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:48:01.685 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:48:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:48:01.685 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:48:05 compute-1 podman[197537]: time="2025-12-02T23:48:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:48:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:48:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:48:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:48:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2573 "" "Go-http-client/1.1"
Dec 02 23:48:10 compute-1 podman[208127]: 2025-12-02 23:48:10.255250523 +0000 UTC m=+0.082970594 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 23:48:13 compute-1 podman[208151]: 2025-12-02 23:48:13.286354047 +0000 UTC m=+0.123281868 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Dec 02 23:48:17 compute-1 podman[208178]: 2025-12-02 23:48:17.242353506 +0000 UTC m=+0.073863059 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:48:19 compute-1 openstack_network_exporter[199685]: ERROR   23:48:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:48:19 compute-1 openstack_network_exporter[199685]: ERROR   23:48:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:48:19 compute-1 openstack_network_exporter[199685]: ERROR   23:48:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:48:19 compute-1 openstack_network_exporter[199685]: ERROR   23:48:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:48:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:48:19 compute-1 openstack_network_exporter[199685]: ERROR   23:48:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:48:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:48:22 compute-1 nova_compute[187157]: 2025-12-02 23:48:22.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:22 compute-1 nova_compute[187157]: 2025-12-02 23:48:22.702 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:22 compute-1 nova_compute[187157]: 2025-12-02 23:48:22.702 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:22 compute-1 nova_compute[187157]: 2025-12-02 23:48:22.703 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:48:22 compute-1 nova_compute[187157]: 2025-12-02 23:48:22.703 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:23 compute-1 nova_compute[187157]: 2025-12-02 23:48:23.225 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:48:23 compute-1 nova_compute[187157]: 2025-12-02 23:48:23.226 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:48:23 compute-1 nova_compute[187157]: 2025-12-02 23:48:23.226 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:48:23 compute-1 nova_compute[187157]: 2025-12-02 23:48:23.226 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:48:23 compute-1 nova_compute[187157]: 2025-12-02 23:48:23.450 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:48:23 compute-1 nova_compute[187157]: 2025-12-02 23:48:23.452 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:48:23 compute-1 nova_compute[187157]: 2025-12-02 23:48:23.475 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:48:23 compute-1 nova_compute[187157]: 2025-12-02 23:48:23.476 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6072MB free_disk=73.2040901184082GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:48:23 compute-1 nova_compute[187157]: 2025-12-02 23:48:23.477 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:48:23 compute-1 nova_compute[187157]: 2025-12-02 23:48:23.477 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:48:24 compute-1 nova_compute[187157]: 2025-12-02 23:48:24.599 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:48:24 compute-1 nova_compute[187157]: 2025-12-02 23:48:24.599 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:48:23 up 55 min,  0 user,  load average: 0.04, 0.37, 0.50\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:48:24 compute-1 nova_compute[187157]: 2025-12-02 23:48:24.663 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing inventories for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 02 23:48:24 compute-1 nova_compute[187157]: 2025-12-02 23:48:24.722 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Updating ProviderTree inventory for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 02 23:48:24 compute-1 nova_compute[187157]: 2025-12-02 23:48:24.722 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Updating inventory in ProviderTree for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 02 23:48:24 compute-1 nova_compute[187157]: 2025-12-02 23:48:24.745 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing aggregate associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 02 23:48:24 compute-1 nova_compute[187157]: 2025-12-02 23:48:24.781 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing trait associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ARCH_X86_64,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 02 23:48:24 compute-1 nova_compute[187157]: 2025-12-02 23:48:24.810 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:48:25 compute-1 nova_compute[187157]: 2025-12-02 23:48:25.325 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:48:25 compute-1 nova_compute[187157]: 2025-12-02 23:48:25.834 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:48:25 compute-1 nova_compute[187157]: 2025-12-02 23:48:25.834 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.357s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:48:26 compute-1 podman[208198]: 2025-12-02 23:48:26.256690727 +0000 UTC m=+0.091136887 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 23:48:26 compute-1 nova_compute[187157]: 2025-12-02 23:48:26.829 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:26 compute-1 nova_compute[187157]: 2025-12-02 23:48:26.830 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:27 compute-1 nova_compute[187157]: 2025-12-02 23:48:27.343 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:27 compute-1 nova_compute[187157]: 2025-12-02 23:48:27.344 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:27 compute-1 nova_compute[187157]: 2025-12-02 23:48:27.344 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:32 compute-1 podman[208220]: 2025-12-02 23:48:32.243177892 +0000 UTC m=+0.071456391 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4)
Dec 02 23:48:35 compute-1 podman[197537]: time="2025-12-02T23:48:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:48:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:48:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:48:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:48:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2568 "" "Go-http-client/1.1"
Dec 02 23:48:41 compute-1 podman[208241]: 2025-12-02 23:48:41.244985985 +0000 UTC m=+0.072749150 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:48:44 compute-1 podman[208265]: 2025-12-02 23:48:44.311846128 +0000 UTC m=+0.140153731 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 02 23:48:48 compute-1 podman[208292]: 2025-12-02 23:48:48.242849838 +0000 UTC m=+0.074072992 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:48:49 compute-1 openstack_network_exporter[199685]: ERROR   23:48:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:48:49 compute-1 openstack_network_exporter[199685]: ERROR   23:48:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:48:49 compute-1 openstack_network_exporter[199685]: ERROR   23:48:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:48:49 compute-1 openstack_network_exporter[199685]: ERROR   23:48:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:48:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:48:49 compute-1 openstack_network_exporter[199685]: ERROR   23:48:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:48:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:48:57 compute-1 podman[208313]: 2025-12-02 23:48:57.229214862 +0000 UTC m=+0.065583827 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, container_name=openstack_network_exporter, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Dec 02 23:49:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:49:01.686 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:49:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:49:01.687 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:49:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:49:01.687 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:49:03 compute-1 podman[208335]: 2025-12-02 23:49:03.266965658 +0000 UTC m=+0.109333259 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd)
Dec 02 23:49:05 compute-1 podman[197537]: time="2025-12-02T23:49:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:49:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:49:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:49:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:49:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2573 "" "Go-http-client/1.1"
Dec 02 23:49:12 compute-1 podman[208356]: 2025-12-02 23:49:12.265795092 +0000 UTC m=+0.089204206 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 23:49:15 compute-1 podman[208381]: 2025-12-02 23:49:15.319286024 +0000 UTC m=+0.150252635 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 23:49:19 compute-1 podman[208408]: 2025-12-02 23:49:19.260219821 +0000 UTC m=+0.091292065 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:49:19 compute-1 openstack_network_exporter[199685]: ERROR   23:49:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:49:19 compute-1 openstack_network_exporter[199685]: ERROR   23:49:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:49:19 compute-1 openstack_network_exporter[199685]: ERROR   23:49:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:49:19 compute-1 openstack_network_exporter[199685]: ERROR   23:49:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:49:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:49:19 compute-1 openstack_network_exporter[199685]: ERROR   23:49:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:49:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:49:22 compute-1 nova_compute[187157]: 2025-12-02 23:49:22.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:49:23 compute-1 nova_compute[187157]: 2025-12-02 23:49:23.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:49:23 compute-1 nova_compute[187157]: 2025-12-02 23:49:23.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:49:23 compute-1 nova_compute[187157]: 2025-12-02 23:49:23.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:49:24 compute-1 nova_compute[187157]: 2025-12-02 23:49:24.222 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:49:24 compute-1 nova_compute[187157]: 2025-12-02 23:49:24.223 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:49:24 compute-1 nova_compute[187157]: 2025-12-02 23:49:24.223 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:49:24 compute-1 nova_compute[187157]: 2025-12-02 23:49:24.224 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:49:24 compute-1 nova_compute[187157]: 2025-12-02 23:49:24.378 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:49:24 compute-1 nova_compute[187157]: 2025-12-02 23:49:24.379 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:49:24 compute-1 nova_compute[187157]: 2025-12-02 23:49:24.399 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:49:24 compute-1 nova_compute[187157]: 2025-12-02 23:49:24.399 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6086MB free_disk=73.2015151977539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:49:24 compute-1 nova_compute[187157]: 2025-12-02 23:49:24.400 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:49:24 compute-1 nova_compute[187157]: 2025-12-02 23:49:24.400 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:49:25 compute-1 nova_compute[187157]: 2025-12-02 23:49:25.465 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:49:25 compute-1 nova_compute[187157]: 2025-12-02 23:49:25.466 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:49:24 up 56 min,  0 user,  load average: 0.08, 0.31, 0.47\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:49:25 compute-1 nova_compute[187157]: 2025-12-02 23:49:25.492 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:49:26 compute-1 nova_compute[187157]: 2025-12-02 23:49:26.001 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:49:26 compute-1 nova_compute[187157]: 2025-12-02 23:49:26.513 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:49:26 compute-1 nova_compute[187157]: 2025-12-02 23:49:26.514 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:49:27 compute-1 nova_compute[187157]: 2025-12-02 23:49:27.509 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:49:27 compute-1 nova_compute[187157]: 2025-12-02 23:49:27.510 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:49:27 compute-1 nova_compute[187157]: 2025-12-02 23:49:27.510 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:49:27 compute-1 nova_compute[187157]: 2025-12-02 23:49:27.510 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:49:27 compute-1 nova_compute[187157]: 2025-12-02 23:49:27.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:49:28 compute-1 podman[208429]: 2025-12-02 23:49:28.248537672 +0000 UTC m=+0.088766136 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Dec 02 23:49:34 compute-1 podman[208451]: 2025-12-02 23:49:34.275003707 +0000 UTC m=+0.105216201 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd)
Dec 02 23:49:35 compute-1 podman[197537]: time="2025-12-02T23:49:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:49:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:49:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:49:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:49:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2573 "" "Go-http-client/1.1"
Dec 02 23:49:43 compute-1 podman[208473]: 2025-12-02 23:49:43.281558648 +0000 UTC m=+0.106346038 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 23:49:43 compute-1 sshd-session[208471]: Invalid user 12345 from 185.156.73.233 port 22552
Dec 02 23:49:43 compute-1 sshd-session[208471]: Connection closed by invalid user 12345 185.156.73.233 port 22552 [preauth]
Dec 02 23:49:46 compute-1 podman[208498]: 2025-12-02 23:49:46.270181249 +0000 UTC m=+0.098562561 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202)
Dec 02 23:49:49 compute-1 openstack_network_exporter[199685]: ERROR   23:49:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:49:49 compute-1 openstack_network_exporter[199685]: ERROR   23:49:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:49:49 compute-1 openstack_network_exporter[199685]: ERROR   23:49:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:49:49 compute-1 openstack_network_exporter[199685]: ERROR   23:49:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:49:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:49:49 compute-1 openstack_network_exporter[199685]: ERROR   23:49:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:49:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:49:50 compute-1 podman[208524]: 2025-12-02 23:49:50.265029973 +0000 UTC m=+0.096885560 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 02 23:49:59 compute-1 podman[208543]: 2025-12-02 23:49:59.243910638 +0000 UTC m=+0.078737104 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 23:50:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:50:01.688 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:50:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:50:01.689 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:50:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:50:01.689 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:50:05 compute-1 podman[208566]: 2025-12-02 23:50:05.255985165 +0000 UTC m=+0.093668983 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Dec 02 23:50:05 compute-1 podman[197537]: time="2025-12-02T23:50:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:50:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:50:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:50:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:50:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2570 "" "Go-http-client/1.1"
Dec 02 23:50:14 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:50:14.081 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:50:14 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:50:14.082 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:50:14 compute-1 podman[208587]: 2025-12-02 23:50:14.238561129 +0000 UTC m=+0.071554222 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:50:15 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:50:15.050 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:7d:48 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68b76ae8150c43ac98862da676697b95', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3d26e027-073c-46dd-95e9-4c77fc749b25, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7b376433-fabc-48f0-aa10-0098b8d1cf58) old=Port_Binding(mac=['fa:16:3e:9a:7d:48'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68b76ae8150c43ac98862da676697b95', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:50:15 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:50:15.051 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7b376433-fabc-48f0-aa10-0098b8d1cf58 in datapath 8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522 updated
Dec 02 23:50:15 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:50:15.052 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:50:15 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:50:15.055 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[24e62969-71dd-43b7-a5db-e1ef52c2fdc5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:50:15 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:50:15.083 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:50:17 compute-1 podman[208611]: 2025-12-02 23:50:17.269413065 +0000 UTC m=+0.098376706 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 23:50:19 compute-1 openstack_network_exporter[199685]: ERROR   23:50:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:50:19 compute-1 openstack_network_exporter[199685]: ERROR   23:50:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:50:19 compute-1 openstack_network_exporter[199685]: ERROR   23:50:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:50:19 compute-1 openstack_network_exporter[199685]: ERROR   23:50:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:50:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:50:19 compute-1 openstack_network_exporter[199685]: ERROR   23:50:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:50:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:50:21 compute-1 podman[208636]: 2025-12-02 23:50:21.255825547 +0000 UTC m=+0.084569684 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 23:50:22 compute-1 nova_compute[187157]: 2025-12-02 23:50:22.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:24 compute-1 nova_compute[187157]: 2025-12-02 23:50:24.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:25 compute-1 nova_compute[187157]: 2025-12-02 23:50:25.222 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:50:25 compute-1 nova_compute[187157]: 2025-12-02 23:50:25.223 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:50:25 compute-1 nova_compute[187157]: 2025-12-02 23:50:25.223 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:50:25 compute-1 nova_compute[187157]: 2025-12-02 23:50:25.224 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:50:25 compute-1 nova_compute[187157]: 2025-12-02 23:50:25.425 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:50:25 compute-1 nova_compute[187157]: 2025-12-02 23:50:25.426 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:50:25 compute-1 nova_compute[187157]: 2025-12-02 23:50:25.456 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:50:25 compute-1 nova_compute[187157]: 2025-12-02 23:50:25.457 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6082MB free_disk=73.20153427124023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:50:25 compute-1 nova_compute[187157]: 2025-12-02 23:50:25.457 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:50:25 compute-1 nova_compute[187157]: 2025-12-02 23:50:25.458 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:50:26 compute-1 nova_compute[187157]: 2025-12-02 23:50:26.734 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:50:26 compute-1 nova_compute[187157]: 2025-12-02 23:50:26.734 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:50:25 up 57 min,  0 user,  load average: 0.03, 0.25, 0.44\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:50:26 compute-1 nova_compute[187157]: 2025-12-02 23:50:26.758 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:50:27 compute-1 nova_compute[187157]: 2025-12-02 23:50:27.266 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:50:27 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:50:27.644 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:fa:1b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0bd9450f-3ee4-4a30-8a14-b8735bd45c3e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0bd9450f-3ee4-4a30-8a14-b8735bd45c3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aad1654ac0c43c38292ab72dec9fb3a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=030d3419-d041-4d8e-8886-4428ecbcc3b5, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=01730371-dcdb-43cc-98fa-0362fa6b15a8) old=Port_Binding(mac=['fa:16:3e:ea:fa:1b'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-0bd9450f-3ee4-4a30-8a14-b8735bd45c3e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0bd9450f-3ee4-4a30-8a14-b8735bd45c3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aad1654ac0c43c38292ab72dec9fb3a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:50:27 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:50:27.645 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 01730371-dcdb-43cc-98fa-0362fa6b15a8 in datapath 0bd9450f-3ee4-4a30-8a14-b8735bd45c3e updated
Dec 02 23:50:27 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:50:27.646 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0bd9450f-3ee4-4a30-8a14-b8735bd45c3e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:50:27 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:50:27.648 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[7ee4ffb1-c453-414d-aa39-fbdcbb82316e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:50:27 compute-1 nova_compute[187157]: 2025-12-02 23:50:27.775 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:50:27 compute-1 nova_compute[187157]: 2025-12-02 23:50:27.776 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.318s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:50:28 compute-1 nova_compute[187157]: 2025-12-02 23:50:28.771 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:28 compute-1 nova_compute[187157]: 2025-12-02 23:50:28.772 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:28 compute-1 nova_compute[187157]: 2025-12-02 23:50:28.772 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:28 compute-1 nova_compute[187157]: 2025-12-02 23:50:28.772 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:28 compute-1 nova_compute[187157]: 2025-12-02 23:50:28.772 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:28 compute-1 nova_compute[187157]: 2025-12-02 23:50:28.773 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:50:29 compute-1 nova_compute[187157]: 2025-12-02 23:50:29.697 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:30 compute-1 nova_compute[187157]: 2025-12-02 23:50:30.208 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:30 compute-1 podman[208657]: 2025-12-02 23:50:30.237426057 +0000 UTC m=+0.074703607 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 02 23:50:35 compute-1 podman[197537]: time="2025-12-02T23:50:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:50:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:50:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:50:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:50:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2572 "" "Go-http-client/1.1"
Dec 02 23:50:36 compute-1 podman[208677]: 2025-12-02 23:50:36.255042421 +0000 UTC m=+0.086400879 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 23:50:45 compute-1 podman[208697]: 2025-12-02 23:50:45.262965327 +0000 UTC m=+0.091601698 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:50:48 compute-1 podman[208722]: 2025-12-02 23:50:48.326957261 +0000 UTC m=+0.164947567 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:50:49 compute-1 openstack_network_exporter[199685]: ERROR   23:50:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:50:49 compute-1 openstack_network_exporter[199685]: ERROR   23:50:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:50:49 compute-1 openstack_network_exporter[199685]: ERROR   23:50:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:50:49 compute-1 openstack_network_exporter[199685]: ERROR   23:50:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:50:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:50:49 compute-1 openstack_network_exporter[199685]: ERROR   23:50:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:50:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:50:52 compute-1 podman[208749]: 2025-12-02 23:50:52.244242617 +0000 UTC m=+0.078605615 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 23:51:01 compute-1 podman[208768]: 2025-12-02 23:51:01.241134652 +0000 UTC m=+0.072289184 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git)
Dec 02 23:51:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:51:01.690 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:51:01.690 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:51:01.691 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:05 compute-1 sshd-session[208791]: Invalid user node from 193.32.162.146 port 33142
Dec 02 23:51:05 compute-1 sshd-session[208791]: Connection closed by invalid user node 193.32.162.146 port 33142 [preauth]
Dec 02 23:51:05 compute-1 podman[197537]: time="2025-12-02T23:51:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:51:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:51:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:51:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:51:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2573 "" "Go-http-client/1.1"
Dec 02 23:51:07 compute-1 podman[208793]: 2025-12-02 23:51:07.269599794 +0000 UTC m=+0.103052661 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 23:51:16 compute-1 podman[208813]: 2025-12-02 23:51:16.223187511 +0000 UTC m=+0.058004532 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 23:51:19 compute-1 podman[208837]: 2025-12-02 23:51:19.271769825 +0000 UTC m=+0.108431270 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 02 23:51:19 compute-1 openstack_network_exporter[199685]: ERROR   23:51:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:51:19 compute-1 openstack_network_exporter[199685]: ERROR   23:51:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:51:19 compute-1 openstack_network_exporter[199685]: ERROR   23:51:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:51:19 compute-1 openstack_network_exporter[199685]: ERROR   23:51:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:51:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:51:19 compute-1 openstack_network_exporter[199685]: ERROR   23:51:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:51:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:51:22 compute-1 nova_compute[187157]: 2025-12-02 23:51:22.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:51:23 compute-1 podman[208863]: 2025-12-02 23:51:23.241597961 +0000 UTC m=+0.072093040 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Dec 02 23:51:26 compute-1 nova_compute[187157]: 2025-12-02 23:51:26.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:51:26 compute-1 nova_compute[187157]: 2025-12-02 23:51:26.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:51:27 compute-1 nova_compute[187157]: 2025-12-02 23:51:27.220 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:27 compute-1 nova_compute[187157]: 2025-12-02 23:51:27.221 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:27 compute-1 nova_compute[187157]: 2025-12-02 23:51:27.221 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:27 compute-1 nova_compute[187157]: 2025-12-02 23:51:27.221 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:51:27 compute-1 nova_compute[187157]: 2025-12-02 23:51:27.389 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:51:27 compute-1 nova_compute[187157]: 2025-12-02 23:51:27.390 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:51:27 compute-1 nova_compute[187157]: 2025-12-02 23:51:27.408 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:51:27 compute-1 nova_compute[187157]: 2025-12-02 23:51:27.409 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6093MB free_disk=73.2015151977539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:51:27 compute-1 nova_compute[187157]: 2025-12-02 23:51:27.409 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:27 compute-1 nova_compute[187157]: 2025-12-02 23:51:27.410 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:28 compute-1 nova_compute[187157]: 2025-12-02 23:51:28.471 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:51:28 compute-1 nova_compute[187157]: 2025-12-02 23:51:28.471 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:51:27 up 58 min,  0 user,  load average: 0.01, 0.20, 0.41\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:51:28 compute-1 nova_compute[187157]: 2025-12-02 23:51:28.508 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:51:29 compute-1 nova_compute[187157]: 2025-12-02 23:51:29.016 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:51:29 compute-1 nova_compute[187157]: 2025-12-02 23:51:29.526 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:51:29 compute-1 nova_compute[187157]: 2025-12-02 23:51:29.526 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.117s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:30 compute-1 nova_compute[187157]: 2025-12-02 23:51:30.522 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:51:30 compute-1 nova_compute[187157]: 2025-12-02 23:51:30.523 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:51:30 compute-1 nova_compute[187157]: 2025-12-02 23:51:30.523 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:51:30 compute-1 nova_compute[187157]: 2025-12-02 23:51:30.524 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:51:30 compute-1 nova_compute[187157]: 2025-12-02 23:51:30.524 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:51:31 compute-1 nova_compute[187157]: 2025-12-02 23:51:31.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:51:32 compute-1 podman[208885]: 2025-12-02 23:51:32.224622113 +0000 UTC m=+0.062502838 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal)
Dec 02 23:51:33 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:51:33.076 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:51:33 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:51:33.077 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:51:33 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:51:33.078 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:51:34 compute-1 sshd-session[208907]: Invalid user admin from 2.57.121.112 port 40456
Dec 02 23:51:34 compute-1 sshd-session[208907]: Received disconnect from 2.57.121.112 port 40456:11: Bye [preauth]
Dec 02 23:51:34 compute-1 sshd-session[208907]: Disconnected from invalid user admin 2.57.121.112 port 40456 [preauth]
Dec 02 23:51:35 compute-1 podman[197537]: time="2025-12-02T23:51:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:51:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:51:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:51:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:51:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2573 "" "Go-http-client/1.1"
Dec 02 23:51:38 compute-1 podman[208910]: 2025-12-02 23:51:38.267683334 +0000 UTC m=+0.097776754 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Dec 02 23:51:47 compute-1 podman[208932]: 2025-12-02 23:51:47.258517365 +0000 UTC m=+0.093478523 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:51:49 compute-1 openstack_network_exporter[199685]: ERROR   23:51:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:51:49 compute-1 openstack_network_exporter[199685]: ERROR   23:51:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:51:49 compute-1 openstack_network_exporter[199685]: ERROR   23:51:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:51:49 compute-1 openstack_network_exporter[199685]: ERROR   23:51:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:51:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:51:49 compute-1 openstack_network_exporter[199685]: ERROR   23:51:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:51:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:51:50 compute-1 podman[208957]: 2025-12-02 23:51:50.309954998 +0000 UTC m=+0.146398620 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Dec 02 23:51:54 compute-1 podman[208986]: 2025-12-02 23:51:54.251059525 +0000 UTC m=+0.081191168 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 02 23:51:54 compute-1 sshd-session[208984]: Received disconnect from 193.46.255.7 port 51412:11:  [preauth]
Dec 02 23:51:54 compute-1 sshd-session[208984]: Disconnected from authenticating user root 193.46.255.7 port 51412 [preauth]
Dec 02 23:51:55 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:51:55.422 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:40:c1 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3957003f9ee8492688556ccb0cd5fdd5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0f7b0db-e402-4898-a011-4ec4b4fff19a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=64808fc3-a961-45a4-b901-b4f1049f2d12) old=Port_Binding(mac=['fa:16:3e:56:40:c1'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3957003f9ee8492688556ccb0cd5fdd5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:51:55 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:51:55.424 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 64808fc3-a961-45a4-b901-b4f1049f2d12 in datapath 2c29168d-89f5-4fdd-a1dd-76c0a34cef80 updated
Dec 02 23:51:55 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:51:55.425 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2c29168d-89f5-4fdd-a1dd-76c0a34cef80, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:51:55 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:51:55.427 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[499b5d9f-7b37-409b-926b-023428877ca4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:01.692 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:01.692 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:01.692 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:03 compute-1 podman[209007]: 2025-12-02 23:52:03.22293121 +0000 UTC m=+0.059806196 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest 
release of the minimal Red Hat Universal Base Image 9., config_id=edpm, managed_by=edpm_ansible, version=9.6, vendor=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, architecture=x86_64)
Dec 02 23:52:03 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:03.925 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5c:5a:4d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-2ed5d5a2-c53b-47f2-94f0-955fc91515bf', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ed5d5a2-c53b-47f2-94f0-955fc91515bf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '059ee4b8b9ab47ffbc539c03339a4112', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7f5fe24-1162-482d-92ac-517788927a8f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=9c0c9072-9900-46a2-b66d-c5e8a478205d) old=Port_Binding(mac=['fa:16:3e:5c:5a:4d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-2ed5d5a2-c53b-47f2-94f0-955fc91515bf', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ed5d5a2-c53b-47f2-94f0-955fc91515bf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '059ee4b8b9ab47ffbc539c03339a4112', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:52:03 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:03.926 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 9c0c9072-9900-46a2-b66d-c5e8a478205d in datapath 2ed5d5a2-c53b-47f2-94f0-955fc91515bf updated
Dec 02 23:52:03 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:03.927 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2ed5d5a2-c53b-47f2-94f0-955fc91515bf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:52:03 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:03.928 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d3f5b4ee-c302-47e3-bcfc-61b96dfc45b1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:05 compute-1 podman[197537]: time="2025-12-02T23:52:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:52:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:52:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:52:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:52:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2574 "" "Go-http-client/1.1"
Dec 02 23:52:09 compute-1 podman[209028]: 2025-12-02 23:52:09.255793347 +0000 UTC m=+0.086965047 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251202)
Dec 02 23:52:18 compute-1 podman[209049]: 2025-12-02 23:52:18.220981891 +0000 UTC m=+0.061159248 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:52:19 compute-1 openstack_network_exporter[199685]: ERROR   23:52:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:52:19 compute-1 openstack_network_exporter[199685]: ERROR   23:52:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:52:19 compute-1 openstack_network_exporter[199685]: ERROR   23:52:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:52:19 compute-1 openstack_network_exporter[199685]: ERROR   23:52:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:52:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:52:19 compute-1 openstack_network_exporter[199685]: ERROR   23:52:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:52:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:52:19 compute-1 nova_compute[187157]: 2025-12-02 23:52:19.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:21 compute-1 nova_compute[187157]: 2025-12-02 23:52:21.208 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:21 compute-1 nova_compute[187157]: 2025-12-02 23:52:21.209 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 02 23:52:21 compute-1 podman[209073]: 2025-12-02 23:52:21.280796425 +0000 UTC m=+0.123583324 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 02 23:52:25 compute-1 nova_compute[187157]: 2025-12-02 23:52:25.210 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:25 compute-1 podman[209100]: 2025-12-02 23:52:25.23600382 +0000 UTC m=+0.072370006 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 02 23:52:25 compute-1 nova_compute[187157]: 2025-12-02 23:52:25.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:25 compute-1 nova_compute[187157]: 2025-12-02 23:52:25.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 02 23:52:26 compute-1 nova_compute[187157]: 2025-12-02 23:52:26.206 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 02 23:52:27 compute-1 nova_compute[187157]: 2025-12-02 23:52:27.207 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:28 compute-1 nova_compute[187157]: 2025-12-02 23:52:28.696 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:28 compute-1 nova_compute[187157]: 2025-12-02 23:52:28.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:28 compute-1 nova_compute[187157]: 2025-12-02 23:52:28.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:28 compute-1 nova_compute[187157]: 2025-12-02 23:52:28.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:52:28 compute-1 nova_compute[187157]: 2025-12-02 23:52:28.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:29 compute-1 nova_compute[187157]: 2025-12-02 23:52:29.219 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:29 compute-1 nova_compute[187157]: 2025-12-02 23:52:29.220 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:29 compute-1 nova_compute[187157]: 2025-12-02 23:52:29.220 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:29 compute-1 nova_compute[187157]: 2025-12-02 23:52:29.221 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:52:29 compute-1 nova_compute[187157]: 2025-12-02 23:52:29.484 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:52:29 compute-1 nova_compute[187157]: 2025-12-02 23:52:29.486 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:29 compute-1 nova_compute[187157]: 2025-12-02 23:52:29.515 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:29 compute-1 nova_compute[187157]: 2025-12-02 23:52:29.517 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6086MB free_disk=73.2015151977539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:52:29 compute-1 nova_compute[187157]: 2025-12-02 23:52:29.517 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:29 compute-1 nova_compute[187157]: 2025-12-02 23:52:29.518 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:31 compute-1 nova_compute[187157]: 2025-12-02 23:52:31.186 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:52:31 compute-1 nova_compute[187157]: 2025-12-02 23:52:31.187 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:52:29 up 59 min,  0 user,  load average: 0.00, 0.16, 0.38\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:52:31 compute-1 nova_compute[187157]: 2025-12-02 23:52:31.259 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:52:31 compute-1 nova_compute[187157]: 2025-12-02 23:52:31.856 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:52:32 compute-1 nova_compute[187157]: 2025-12-02 23:52:32.467 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:52:32 compute-1 nova_compute[187157]: 2025-12-02 23:52:32.467 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.950s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:33 compute-1 nova_compute[187157]: 2025-12-02 23:52:33.464 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:33 compute-1 nova_compute[187157]: 2025-12-02 23:52:33.973 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:33 compute-1 nova_compute[187157]: 2025-12-02 23:52:33.973 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:34 compute-1 podman[209123]: 2025-12-02 23:52:34.240882506 +0000 UTC m=+0.073101623 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container)
Dec 02 23:52:35 compute-1 podman[197537]: time="2025-12-02T23:52:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:52:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:52:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:52:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:52:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2575 "" "Go-http-client/1.1"
Dec 02 23:52:39 compute-1 nova_compute[187157]: 2025-12-02 23:52:39.372 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "df034907-3b38-4357-af57-2750b437bf22" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:39 compute-1 nova_compute[187157]: 2025-12-02 23:52:39.374 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "df034907-3b38-4357-af57-2750b437bf22" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:39 compute-1 nova_compute[187157]: 2025-12-02 23:52:39.881 187161 DEBUG nova.compute.manager [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 02 23:52:40 compute-1 podman[209145]: 2025-12-02 23:52:40.25978885 +0000 UTC m=+0.090107422 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202)
Dec 02 23:52:40 compute-1 nova_compute[187157]: 2025-12-02 23:52:40.503 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:40 compute-1 nova_compute[187157]: 2025-12-02 23:52:40.504 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:40 compute-1 nova_compute[187157]: 2025-12-02 23:52:40.514 187161 DEBUG nova.virt.hardware [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 02 23:52:40 compute-1 nova_compute[187157]: 2025-12-02 23:52:40.515 187161 INFO nova.compute.claims [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Claim successful on node compute-1.ctlplane.example.com
Dec 02 23:52:41 compute-1 nova_compute[187157]: 2025-12-02 23:52:41.605 187161 DEBUG nova.compute.provider_tree [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:52:42 compute-1 nova_compute[187157]: 2025-12-02 23:52:42.116 187161 DEBUG nova.scheduler.client.report [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:52:42 compute-1 nova_compute[187157]: 2025-12-02 23:52:42.625 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.121s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:42 compute-1 nova_compute[187157]: 2025-12-02 23:52:42.626 187161 DEBUG nova.compute.manager [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 02 23:52:43 compute-1 nova_compute[187157]: 2025-12-02 23:52:43.158 187161 DEBUG nova.compute.manager [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 02 23:52:43 compute-1 nova_compute[187157]: 2025-12-02 23:52:43.160 187161 DEBUG nova.network.neutron [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 02 23:52:43 compute-1 nova_compute[187157]: 2025-12-02 23:52:43.161 187161 WARNING neutronclient.v2_0.client [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:52:43 compute-1 nova_compute[187157]: 2025-12-02 23:52:43.163 187161 WARNING neutronclient.v2_0.client [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:52:43 compute-1 nova_compute[187157]: 2025-12-02 23:52:43.477 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:43 compute-1 nova_compute[187157]: 2025-12-02 23:52:43.677 187161 INFO nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 02 23:52:43 compute-1 nova_compute[187157]: 2025-12-02 23:52:43.987 187161 WARNING nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
Dec 02 23:52:43 compute-1 nova_compute[187157]: 2025-12-02 23:52:43.988 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Triggering sync for uuid df034907-3b38-4357-af57-2750b437bf22 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11024
Dec 02 23:52:43 compute-1 nova_compute[187157]: 2025-12-02 23:52:43.988 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "df034907-3b38-4357-af57-2750b437bf22" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:44 compute-1 nova_compute[187157]: 2025-12-02 23:52:44.186 187161 DEBUG nova.compute.manager [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 02 23:52:45 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:45.072 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:52:45 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:45.074 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:52:45 compute-1 nova_compute[187157]: 2025-12-02 23:52:45.209 187161 DEBUG nova.compute.manager [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 02 23:52:45 compute-1 nova_compute[187157]: 2025-12-02 23:52:45.212 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 02 23:52:45 compute-1 nova_compute[187157]: 2025-12-02 23:52:45.212 187161 INFO nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Creating image(s)
Dec 02 23:52:45 compute-1 nova_compute[187157]: 2025-12-02 23:52:45.214 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "/var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:45 compute-1 nova_compute[187157]: 2025-12-02 23:52:45.214 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "/var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:45 compute-1 nova_compute[187157]: 2025-12-02 23:52:45.216 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "/var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:45 compute-1 nova_compute[187157]: 2025-12-02 23:52:45.216 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:45 compute-1 nova_compute[187157]: 2025-12-02 23:52:45.217 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:45 compute-1 nova_compute[187157]: 2025-12-02 23:52:45.419 187161 DEBUG nova.network.neutron [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Successfully created port: 81ed667e-774b-42ce-8879-7225f57db1d7 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 02 23:52:46 compute-1 nova_compute[187157]: 2025-12-02 23:52:46.491 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:52:46 compute-1 nova_compute[187157]: 2025-12-02 23:52:46.498 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:52:46 compute-1 nova_compute[187157]: 2025-12-02 23:52:46.499 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.part --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:46 compute-1 nova_compute[187157]: 2025-12-02 23:52:46.581 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.part --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:46 compute-1 nova_compute[187157]: 2025-12-02 23:52:46.583 187161 DEBUG nova.virt.images [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] 92e79321-71af-44a0-869c-1d5a9da5fefc was qcow2, converting to raw fetch_to_raw /usr/lib/python3.12/site-packages/nova/virt/images.py:278
Dec 02 23:52:46 compute-1 nova_compute[187157]: 2025-12-02 23:52:46.584 187161 DEBUG nova.privsep.utils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Dec 02 23:52:46 compute-1 nova_compute[187157]: 2025-12-02 23:52:46.585 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.part /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.converted execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:46 compute-1 nova_compute[187157]: 2025-12-02 23:52:46.798 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.part /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.converted" returned: 0 in 0.212s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:46 compute-1 nova_compute[187157]: 2025-12-02 23:52:46.806 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.converted --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:46 compute-1 nova_compute[187157]: 2025-12-02 23:52:46.860 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.converted --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:46 compute-1 nova_compute[187157]: 2025-12-02 23:52:46.862 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.644s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:46 compute-1 nova_compute[187157]: 2025-12-02 23:52:46.862 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:52:46 compute-1 nova_compute[187157]: 2025-12-02 23:52:46.867 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:52:46 compute-1 nova_compute[187157]: 2025-12-02 23:52:46.868 187161 INFO oslo.privsep.daemon [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpbgs43ze3/privsep.sock']
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.008 187161 DEBUG nova.network.neutron [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Successfully updated port: 81ed667e-774b-42ce-8879-7225f57db1d7 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.050 187161 DEBUG nova.compute.manager [req-b27b2dff-e6ab-40b6-916d-ef5316439f18 req-fee41124-9806-48ca-bdc1-2b8f772ca487 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Received event network-changed-81ed667e-774b-42ce-8879-7225f57db1d7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.050 187161 DEBUG nova.compute.manager [req-b27b2dff-e6ab-40b6-916d-ef5316439f18 req-fee41124-9806-48ca-bdc1-2b8f772ca487 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Refreshing instance network info cache due to event network-changed-81ed667e-774b-42ce-8879-7225f57db1d7. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.051 187161 DEBUG oslo_concurrency.lockutils [req-b27b2dff-e6ab-40b6-916d-ef5316439f18 req-fee41124-9806-48ca-bdc1-2b8f772ca487 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-df034907-3b38-4357-af57-2750b437bf22" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.051 187161 DEBUG oslo_concurrency.lockutils [req-b27b2dff-e6ab-40b6-916d-ef5316439f18 req-fee41124-9806-48ca-bdc1-2b8f772ca487 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-df034907-3b38-4357-af57-2750b437bf22" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.051 187161 DEBUG nova.network.neutron [req-b27b2dff-e6ab-40b6-916d-ef5316439f18 req-fee41124-9806-48ca-bdc1-2b8f772ca487 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Refreshing network info cache for port 81ed667e-774b-42ce-8879-7225f57db1d7 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 02 23:52:47 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:47.075 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.514 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "refresh_cache-df034907-3b38-4357-af57-2750b437bf22" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.557 187161 WARNING neutronclient.v2_0.client [req-b27b2dff-e6ab-40b6-916d-ef5316439f18 req-fee41124-9806-48ca-bdc1-2b8f772ca487 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.637 187161 INFO oslo.privsep.daemon [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Spawned new privsep daemon via rootwrap
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.457 209187 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.462 209187 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.464 209187 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.465 209187 INFO oslo.privsep.daemon [-] privsep daemon running as pid 209187
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.757 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.824 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.825 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.827 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.828 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.834 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.835 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.891 187161 DEBUG nova.network.neutron [req-b27b2dff-e6ab-40b6-916d-ef5316439f18 req-fee41124-9806-48ca-bdc1-2b8f772ca487 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.927 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.928 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.971 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.973 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:47 compute-1 nova_compute[187157]: 2025-12-02 23:52:47.974 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:48 compute-1 nova_compute[187157]: 2025-12-02 23:52:48.027 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:48 compute-1 nova_compute[187157]: 2025-12-02 23:52:48.028 187161 DEBUG nova.virt.disk.api [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Checking if we can resize image /var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 02 23:52:48 compute-1 nova_compute[187157]: 2025-12-02 23:52:48.029 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:48 compute-1 nova_compute[187157]: 2025-12-02 23:52:48.079 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:48 compute-1 nova_compute[187157]: 2025-12-02 23:52:48.080 187161 DEBUG nova.virt.disk.api [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Cannot resize image /var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 02 23:52:48 compute-1 nova_compute[187157]: 2025-12-02 23:52:48.081 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 02 23:52:48 compute-1 nova_compute[187157]: 2025-12-02 23:52:48.082 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Ensure instance console log exists: /var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 02 23:52:48 compute-1 nova_compute[187157]: 2025-12-02 23:52:48.083 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:48 compute-1 nova_compute[187157]: 2025-12-02 23:52:48.083 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:48 compute-1 nova_compute[187157]: 2025-12-02 23:52:48.083 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:48 compute-1 nova_compute[187157]: 2025-12-02 23:52:48.161 187161 DEBUG nova.network.neutron [req-b27b2dff-e6ab-40b6-916d-ef5316439f18 req-fee41124-9806-48ca-bdc1-2b8f772ca487 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:52:48 compute-1 nova_compute[187157]: 2025-12-02 23:52:48.670 187161 DEBUG oslo_concurrency.lockutils [req-b27b2dff-e6ab-40b6-916d-ef5316439f18 req-fee41124-9806-48ca-bdc1-2b8f772ca487 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-df034907-3b38-4357-af57-2750b437bf22" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:52:48 compute-1 nova_compute[187157]: 2025-12-02 23:52:48.671 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquired lock "refresh_cache-df034907-3b38-4357-af57-2750b437bf22" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:52:48 compute-1 nova_compute[187157]: 2025-12-02 23:52:48.672 187161 DEBUG nova.network.neutron [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:52:49 compute-1 podman[209204]: 2025-12-02 23:52:49.270937305 +0000 UTC m=+0.101532626 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:52:49 compute-1 nova_compute[187157]: 2025-12-02 23:52:49.313 187161 DEBUG nova.network.neutron [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:52:49 compute-1 openstack_network_exporter[199685]: ERROR   23:52:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:52:49 compute-1 openstack_network_exporter[199685]: ERROR   23:52:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:52:49 compute-1 openstack_network_exporter[199685]: ERROR   23:52:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:52:49 compute-1 openstack_network_exporter[199685]: ERROR   23:52:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:52:49 compute-1 openstack_network_exporter[199685]: ERROR   23:52:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:52:49 compute-1 nova_compute[187157]: 2025-12-02 23:52:49.682 187161 WARNING neutronclient.v2_0.client [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.148 187161 DEBUG nova.network.neutron [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Updating instance_info_cache with network_info: [{"id": "81ed667e-774b-42ce-8879-7225f57db1d7", "address": "fa:16:3e:19:e5:22", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81ed667e-77", "ovs_interfaceid": "81ed667e-774b-42ce-8879-7225f57db1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.656 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Releasing lock "refresh_cache-df034907-3b38-4357-af57-2750b437bf22" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.657 187161 DEBUG nova.compute.manager [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Instance network_info: |[{"id": "81ed667e-774b-42ce-8879-7225f57db1d7", "address": "fa:16:3e:19:e5:22", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81ed667e-77", "ovs_interfaceid": "81ed667e-774b-42ce-8879-7225f57db1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.662 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Start _get_guest_xml network_info=[{"id": "81ed667e-774b-42ce-8879-7225f57db1d7", "address": "fa:16:3e:19:e5:22", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81ed667e-77", "ovs_interfaceid": "81ed667e-774b-42ce-8879-7225f57db1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.669 187161 WARNING nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.672 187161 DEBUG nova.virt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestDataModel-server-328293485', uuid='df034907-3b38-4357-af57-2750b437bf22'), owner=OwnerMeta(userid='d032790eea2c4094b69ea4a2576bff68', username='tempest-TestDataModel-1253061916-project-admin', projectid='059ee4b8b9ab47ffbc539c03339a4112', projectname='tempest-TestDataModel-1253061916'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "81ed667e-774b-42ce-8879-7225f57db1d7", "address": "fa:16:3e:19:e5:22", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81ed667e-77", "ovs_interfaceid": "81ed667e-774b-42ce-8879-7225f57db1d7", "qbh_params": null, "qbg_params": null, "active": false, 
"vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764719570.672113) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.677 187161 DEBUG nova.virt.libvirt.host [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.678 187161 DEBUG nova.virt.libvirt.host [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.684 187161 DEBUG nova.virt.libvirt.host [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.684 187161 DEBUG nova.virt.libvirt.host [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.687 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.688 187161 DEBUG nova.virt.hardware [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.688 187161 DEBUG nova.virt.hardware [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.689 187161 DEBUG nova.virt.hardware [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.689 187161 DEBUG nova.virt.hardware [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.690 187161 DEBUG nova.virt.hardware [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.690 187161 DEBUG nova.virt.hardware [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.690 187161 DEBUG nova.virt.hardware [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.691 187161 DEBUG nova.virt.hardware [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.691 187161 DEBUG nova.virt.hardware [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.692 187161 DEBUG nova.virt.hardware [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.692 187161 DEBUG nova.virt.hardware [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.698 187161 DEBUG nova.privsep.utils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.700 187161 DEBUG nova.virt.libvirt.vif [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-02T23:52:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-328293485',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testdatamodel-server-328293485',id=3,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='059ee4b8b9ab47ffbc539c03339a4112',ramdisk_id='',reservation_id='r-d9nchw4g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-1253061916',owner_user_name='tempest-TestDataModel-1253061916-project-admin'},tags=TagList,task_state='s
pawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:52:44Z,user_data=None,user_id='d032790eea2c4094b69ea4a2576bff68',uuid=df034907-3b38-4357-af57-2750b437bf22,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "81ed667e-774b-42ce-8879-7225f57db1d7", "address": "fa:16:3e:19:e5:22", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81ed667e-77", "ovs_interfaceid": "81ed667e-774b-42ce-8879-7225f57db1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.700 187161 DEBUG nova.network.os_vif_util [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Converting VIF {"id": "81ed667e-774b-42ce-8879-7225f57db1d7", "address": "fa:16:3e:19:e5:22", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81ed667e-77", "ovs_interfaceid": "81ed667e-774b-42ce-8879-7225f57db1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.702 187161 DEBUG nova.network.os_vif_util [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:e5:22,bridge_name='br-int',has_traffic_filtering=True,id=81ed667e-774b-42ce-8879-7225f57db1d7,network=Network(2c29168d-89f5-4fdd-a1dd-76c0a34cef80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81ed667e-77') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:52:50 compute-1 nova_compute[187157]: 2025-12-02 23:52:50.704 187161 DEBUG nova.objects.instance [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lazy-loading 'pci_devices' on Instance uuid df034907-3b38-4357-af57-2750b437bf22 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.216 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] End _get_guest_xml xml=<domain type="kvm">
Dec 02 23:52:51 compute-1 nova_compute[187157]:   <uuid>df034907-3b38-4357-af57-2750b437bf22</uuid>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   <name>instance-00000003</name>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   <memory>131072</memory>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   <vcpu>1</vcpu>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   <metadata>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <nova:name>tempest-TestDataModel-server-328293485</nova:name>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-02 23:52:50</nova:creationTime>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 02 23:52:51 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 02 23:52:51 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 02 23:52:51 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 02 23:52:51 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 23:52:51 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 02 23:52:51 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 02 23:52:51 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 02 23:52:51 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 02 23:52:51 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 02 23:52:51 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 02 23:52:51 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 02 23:52:51 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 02 23:52:51 compute-1 nova_compute[187157]:         <nova:properties>
Dec 02 23:52:51 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 02 23:52:51 compute-1 nova_compute[187157]:         </nova:properties>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       </nova:image>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <nova:owner>
Dec 02 23:52:51 compute-1 nova_compute[187157]:         <nova:user uuid="d032790eea2c4094b69ea4a2576bff68">tempest-TestDataModel-1253061916-project-admin</nova:user>
Dec 02 23:52:51 compute-1 nova_compute[187157]:         <nova:project uuid="059ee4b8b9ab47ffbc539c03339a4112">tempest-TestDataModel-1253061916</nova:project>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       </nova:owner>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <nova:ports>
Dec 02 23:52:51 compute-1 nova_compute[187157]:         <nova:port uuid="81ed667e-774b-42ce-8879-7225f57db1d7">
Dec 02 23:52:51 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:         </nova:port>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       </nova:ports>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     </nova:instance>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   </metadata>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <system>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <entry name="serial">df034907-3b38-4357-af57-2750b437bf22</entry>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <entry name="uuid">df034907-3b38-4357-af57-2750b437bf22</entry>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     </system>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   </sysinfo>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   <os>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   </os>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   <features>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <acpi/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <apic/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <vmcoreinfo/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   </features>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   </clock>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact">
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <model>Nehalem</model>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <topology sockets="1" cores="1" threads="1"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   </cpu>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   <devices>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22/disk"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     </disk>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22/disk.config"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     </disk>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:19:e5:22"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <target dev="tap81ed667e-77"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     </interface>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22/console.log" append="off"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     </serial>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <video>
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     </video>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     </rng>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <controller type="usb" index="0"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 02 23:52:51 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 02 23:52:51 compute-1 nova_compute[187157]:     </memballoon>
Dec 02 23:52:51 compute-1 nova_compute[187157]:   </devices>
Dec 02 23:52:51 compute-1 nova_compute[187157]: </domain>
Dec 02 23:52:51 compute-1 nova_compute[187157]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.218 187161 DEBUG nova.compute.manager [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Preparing to wait for external event network-vif-plugged-81ed667e-774b-42ce-8879-7225f57db1d7 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.218 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "df034907-3b38-4357-af57-2750b437bf22-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.219 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "df034907-3b38-4357-af57-2750b437bf22-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.219 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "df034907-3b38-4357-af57-2750b437bf22-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.220 187161 DEBUG nova.virt.libvirt.vif [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-02T23:52:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-328293485',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testdatamodel-server-328293485',id=3,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='059ee4b8b9ab47ffbc539c03339a4112',ramdisk_id='',reservation_id='r-d9nchw4g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-1253061916',owner_user_name='tempest-TestDataModel-1253061916-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:52:44Z,user_data=None,user_id='d032790eea2c4094b69ea4a2576bff68',uuid=df034907-3b38-4357-af57-2750b437bf22,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "81ed667e-774b-42ce-8879-7225f57db1d7", "address": "fa:16:3e:19:e5:22", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81ed667e-77", "ovs_interfaceid": "81ed667e-774b-42ce-8879-7225f57db1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.220 187161 DEBUG nova.network.os_vif_util [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Converting VIF {"id": "81ed667e-774b-42ce-8879-7225f57db1d7", "address": "fa:16:3e:19:e5:22", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81ed667e-77", "ovs_interfaceid": "81ed667e-774b-42ce-8879-7225f57db1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.221 187161 DEBUG nova.network.os_vif_util [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:e5:22,bridge_name='br-int',has_traffic_filtering=True,id=81ed667e-774b-42ce-8879-7225f57db1d7,network=Network(2c29168d-89f5-4fdd-a1dd-76c0a34cef80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81ed667e-77') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.222 187161 DEBUG os_vif [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:e5:22,bridge_name='br-int',has_traffic_filtering=True,id=81ed667e-774b-42ce-8879-7225f57db1d7,network=Network(2c29168d-89f5-4fdd-a1dd-76c0a34cef80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81ed667e-77') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.317 187161 DEBUG ovsdbapp.backend.ovs_idl [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.317 187161 DEBUG ovsdbapp.backend.ovs_idl [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.318 187161 DEBUG ovsdbapp.backend.ovs_idl [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.319 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.319 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.320 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.321 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.323 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.327 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.339 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.340 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.340 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.342 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.342 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '8b0f2113-7bf9-53d0-b71a-6ead12fc8da7', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.344 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.345 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:51 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.348 187161 INFO oslo.privsep.daemon [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpwia6bgid/privsep.sock']
Dec 02 23:52:52 compute-1 nova_compute[187157]: 2025-12-02 23:52:52.107 187161 INFO oslo.privsep.daemon [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Spawned new privsep daemon via rootwrap
Dec 02 23:52:52 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.945 209232 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 23:52:52 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.949 209232 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 23:52:52 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.951 209232 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Dec 02 23:52:52 compute-1 nova_compute[187157]: 2025-12-02 23:52:51.951 209232 INFO oslo.privsep.daemon [-] privsep daemon running as pid 209232
Dec 02 23:52:52 compute-1 podman[209234]: 2025-12-02 23:52:52.280301379 +0000 UTC m=+0.113085213 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Dec 02 23:52:52 compute-1 nova_compute[187157]: 2025-12-02 23:52:52.350 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:52 compute-1 nova_compute[187157]: 2025-12-02 23:52:52.351 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81ed667e-77, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:52 compute-1 nova_compute[187157]: 2025-12-02 23:52:52.351 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap81ed667e-77, col_values=(('qos', UUID('e24e4e8a-bc9c-42f0-b62d-f559098bfd5b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:52 compute-1 nova_compute[187157]: 2025-12-02 23:52:52.352 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap81ed667e-77, col_values=(('external_ids', {'iface-id': '81ed667e-774b-42ce-8879-7225f57db1d7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:e5:22', 'vm-uuid': 'df034907-3b38-4357-af57-2750b437bf22'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:52 compute-1 nova_compute[187157]: 2025-12-02 23:52:52.399 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:52 compute-1 NetworkManager[55553]: <info>  [1764719572.4005] manager: (tap81ed667e-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Dec 02 23:52:52 compute-1 nova_compute[187157]: 2025-12-02 23:52:52.403 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:52:52 compute-1 nova_compute[187157]: 2025-12-02 23:52:52.408 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:52 compute-1 nova_compute[187157]: 2025-12-02 23:52:52.410 187161 INFO os_vif [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:e5:22,bridge_name='br-int',has_traffic_filtering=True,id=81ed667e-774b-42ce-8879-7225f57db1d7,network=Network(2c29168d-89f5-4fdd-a1dd-76c0a34cef80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81ed667e-77')
Dec 02 23:52:53 compute-1 nova_compute[187157]: 2025-12-02 23:52:53.964 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:52:53 compute-1 nova_compute[187157]: 2025-12-02 23:52:53.966 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:52:53 compute-1 nova_compute[187157]: 2025-12-02 23:52:53.967 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] No VIF found with MAC fa:16:3e:19:e5:22, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 02 23:52:53 compute-1 nova_compute[187157]: 2025-12-02 23:52:53.968 187161 INFO nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Using config drive
Dec 02 23:52:54 compute-1 nova_compute[187157]: 2025-12-02 23:52:54.481 187161 WARNING neutronclient.v2_0.client [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:52:55 compute-1 nova_compute[187157]: 2025-12-02 23:52:55.130 187161 INFO nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Creating config drive at /var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22/disk.config
Dec 02 23:52:55 compute-1 nova_compute[187157]: 2025-12-02 23:52:55.140 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpxf3bfrbv execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:55 compute-1 nova_compute[187157]: 2025-12-02 23:52:55.286 187161 DEBUG oslo_concurrency.processutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpxf3bfrbv" returned: 0 in 0.146s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:55 compute-1 kernel: tun: Universal TUN/TAP device driver, 1.6
Dec 02 23:52:55 compute-1 kernel: tap81ed667e-77: entered promiscuous mode
Dec 02 23:52:55 compute-1 NetworkManager[55553]: <info>  [1764719575.4255] manager: (tap81ed667e-77): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Dec 02 23:52:55 compute-1 ovn_controller[95464]: 2025-12-02T23:52:55Z|00039|binding|INFO|Claiming lport 81ed667e-774b-42ce-8879-7225f57db1d7 for this chassis.
Dec 02 23:52:55 compute-1 ovn_controller[95464]: 2025-12-02T23:52:55Z|00040|binding|INFO|81ed667e-774b-42ce-8879-7225f57db1d7: Claiming fa:16:3e:19:e5:22 10.100.0.6
Dec 02 23:52:55 compute-1 nova_compute[187157]: 2025-12-02 23:52:55.426 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:55 compute-1 nova_compute[187157]: 2025-12-02 23:52:55.436 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:55 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:55.453 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:e5:22 10.100.0.6'], port_security=['fa:16:3e:19:e5:22 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'df034907-3b38-4357-af57-2750b437bf22', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '059ee4b8b9ab47ffbc539c03339a4112', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9f795f84-34cd-4429-87ae-cc3f6a2442e6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0f7b0db-e402-4898-a011-4ec4b4fff19a, chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=81ed667e-774b-42ce-8879-7225f57db1d7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:52:55 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:55.454 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 81ed667e-774b-42ce-8879-7225f57db1d7 in datapath 2c29168d-89f5-4fdd-a1dd-76c0a34cef80 bound to our chassis
Dec 02 23:52:55 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:55.455 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2c29168d-89f5-4fdd-a1dd-76c0a34cef80
Dec 02 23:52:55 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:55.493 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[8a0150dd-a6d6-47c0-b80b-adf7c85c50b0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:55 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:55.495 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2c29168d-81 in ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 02 23:52:55 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:55.501 207957 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2c29168d-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 02 23:52:55 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:55.502 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a1757055-9eb6-4c5e-8b7e-256a81aefbec]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:55 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:55.503 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[497aa5b4-0941-4fa6-b596-f8a18045f941]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:55 compute-1 systemd-udevd[209304]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:52:55 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:55.526 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[6ad1e422-3252-4590-b31b-d6afda10c6ab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:55 compute-1 NetworkManager[55553]: <info>  [1764719575.5303] device (tap81ed667e-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:52:55 compute-1 NetworkManager[55553]: <info>  [1764719575.5316] device (tap81ed667e-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 02 23:52:55 compute-1 nova_compute[187157]: 2025-12-02 23:52:55.538 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:55 compute-1 ovn_controller[95464]: 2025-12-02T23:52:55Z|00041|binding|INFO|Setting lport 81ed667e-774b-42ce-8879-7225f57db1d7 ovn-installed in OVS
Dec 02 23:52:55 compute-1 ovn_controller[95464]: 2025-12-02T23:52:55Z|00042|binding|INFO|Setting lport 81ed667e-774b-42ce-8879-7225f57db1d7 up in Southbound
Dec 02 23:52:55 compute-1 nova_compute[187157]: 2025-12-02 23:52:55.545 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:55 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:55.548 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[5efa94b0-ea14-4590-bb7c-cd27515d9b82]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:55 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:55.551 104348 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp6cfpo9_0/privsep.sock']
Dec 02 23:52:55 compute-1 podman[209274]: 2025-12-02 23:52:55.561415285 +0000 UTC m=+0.164515838 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:52:55 compute-1 systemd-machined[153454]: New machine qemu-1-instance-00000003.
Dec 02 23:52:55 compute-1 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Dec 02 23:52:56 compute-1 nova_compute[187157]: 2025-12-02 23:52:56.208 187161 DEBUG nova.compute.manager [req-83d758e1-c58b-408c-84bc-2882ea3592d3 req-d126e784-61e7-49df-a15f-ead7a5d4464f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Received event network-vif-plugged-81ed667e-774b-42ce-8879-7225f57db1d7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:52:56 compute-1 nova_compute[187157]: 2025-12-02 23:52:56.208 187161 DEBUG oslo_concurrency.lockutils [req-83d758e1-c58b-408c-84bc-2882ea3592d3 req-d126e784-61e7-49df-a15f-ead7a5d4464f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "df034907-3b38-4357-af57-2750b437bf22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:56 compute-1 nova_compute[187157]: 2025-12-02 23:52:56.209 187161 DEBUG oslo_concurrency.lockutils [req-83d758e1-c58b-408c-84bc-2882ea3592d3 req-d126e784-61e7-49df-a15f-ead7a5d4464f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "df034907-3b38-4357-af57-2750b437bf22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:56 compute-1 nova_compute[187157]: 2025-12-02 23:52:56.209 187161 DEBUG oslo_concurrency.lockutils [req-83d758e1-c58b-408c-84bc-2882ea3592d3 req-d126e784-61e7-49df-a15f-ead7a5d4464f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "df034907-3b38-4357-af57-2750b437bf22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:56 compute-1 nova_compute[187157]: 2025-12-02 23:52:56.210 187161 DEBUG nova.compute.manager [req-83d758e1-c58b-408c-84bc-2882ea3592d3 req-d126e784-61e7-49df-a15f-ead7a5d4464f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Processing event network-vif-plugged-81ed667e-774b-42ce-8879-7225f57db1d7 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 02 23:52:56 compute-1 nova_compute[187157]: 2025-12-02 23:52:56.211 187161 DEBUG nova.compute.manager [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 02 23:52:56 compute-1 nova_compute[187157]: 2025-12-02 23:52:56.216 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 02 23:52:56 compute-1 nova_compute[187157]: 2025-12-02 23:52:56.230 187161 INFO nova.virt.libvirt.driver [-] [instance: df034907-3b38-4357-af57-2750b437bf22] Instance spawned successfully.
Dec 02 23:52:56 compute-1 nova_compute[187157]: 2025-12-02 23:52:56.231 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 02 23:52:56 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:56.330 104348 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 02 23:52:56 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:56.331 104348 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp6cfpo9_0/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Dec 02 23:52:56 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:56.144 209338 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 23:52:56 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:56.148 209338 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 23:52:56 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:56.150 209338 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 02 23:52:56 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:56.151 209338 INFO oslo.privsep.daemon [-] privsep daemon running as pid 209338
Dec 02 23:52:56 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:56.334 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[48223234-b409-49d2-bfe4-48da72088412]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:56 compute-1 nova_compute[187157]: 2025-12-02 23:52:56.348 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:56 compute-1 nova_compute[187157]: 2025-12-02 23:52:56.747 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:52:56 compute-1 nova_compute[187157]: 2025-12-02 23:52:56.748 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:52:56 compute-1 nova_compute[187157]: 2025-12-02 23:52:56.748 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:52:56 compute-1 nova_compute[187157]: 2025-12-02 23:52:56.749 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:52:56 compute-1 nova_compute[187157]: 2025-12-02 23:52:56.750 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:52:56 compute-1 nova_compute[187157]: 2025-12-02 23:52:56.751 187161 DEBUG nova.virt.libvirt.driver [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:52:56 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:56.846 209338 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:56 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:56.846 209338 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:56 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:56.846 209338 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:57 compute-1 nova_compute[187157]: 2025-12-02 23:52:57.261 187161 INFO nova.compute.manager [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Took 12.05 seconds to spawn the instance on the hypervisor.
Dec 02 23:52:57 compute-1 nova_compute[187157]: 2025-12-02 23:52:57.263 187161 DEBUG nova.compute.manager [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.341 209338 INFO oslo_service.backend [-] Loading backend: eventlet
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.347 209338 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Dec 02 23:52:57 compute-1 nova_compute[187157]: 2025-12-02 23:52:57.400 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.430 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[ed09874c-9ce4-4522-8960-40cfd6944a66]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:57 compute-1 systemd-udevd[209308]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.452 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[0084642f-451d-4739-a822-81ae576618f0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:57 compute-1 NetworkManager[55553]: <info>  [1764719577.4543] manager: (tap2c29168d-80): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.496 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb6cb9e-f3f5-42fd-ad5e-d0ac74b6b52c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.500 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[09d2d844-0568-4961-820e-ea3a7dd00c36]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:57 compute-1 NetworkManager[55553]: <info>  [1764719577.5272] device (tap2c29168d-80): carrier: link connected
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.535 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[5ea8eba0-818d-4402-9fc1-54bf183c36ff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.561 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[4fb8277f-b0a0-4391-b812-bcf23afc3a2c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2c29168d-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:56:40:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 359718, 'reachable_time': 32979, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209361, 'error': None, 'target': 'ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.580 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[77246d8a-7fe7-47c5-9de4-321c3d6c7196]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe56:40c1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 359718, 'tstamp': 359718}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209362, 'error': None, 'target': 'ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.600 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[811d43a1-1453-4cef-bab8-e092e1096d3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2c29168d-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:56:40:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 359718, 'reachable_time': 32979, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 209363, 'error': None, 'target': 'ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.640 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[7cebbc23-5d13-4b49-80f1-0e382cfb0aac]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.745 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[da4eb9e5-d6be-4917-ab1d-105a122ccf2a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.747 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c29168d-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.747 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.748 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c29168d-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:57 compute-1 nova_compute[187157]: 2025-12-02 23:52:57.750 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:57 compute-1 NetworkManager[55553]: <info>  [1764719577.7510] manager: (tap2c29168d-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Dec 02 23:52:57 compute-1 kernel: tap2c29168d-80: entered promiscuous mode
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.752 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2c29168d-80, col_values=(('external_ids', {'iface-id': '64808fc3-a961-45a4-b901-b4f1049f2d12'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:57 compute-1 ovn_controller[95464]: 2025-12-02T23:52:57Z|00043|binding|INFO|Releasing lport 64808fc3-a961-45a4-b901-b4f1049f2d12 from this chassis (sb_readonly=0)
Dec 02 23:52:57 compute-1 nova_compute[187157]: 2025-12-02 23:52:57.756 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.758 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ad7d0b0b-e36c-4068-90c1-b67e69f25933]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.758 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.759 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.759 104348 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 2c29168d-89f5-4fdd-a1dd-76c0a34cef80 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.759 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.760 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[86eb5986-d491-4a01-8793-1b85f3c2fb95]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.761 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.761 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a4f25e54-19b5-4cb0-bf0e-f86f052353fc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.761 104348 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: global
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     log         /dev/log local0 debug
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     log-tag     haproxy-metadata-proxy-2c29168d-89f5-4fdd-a1dd-76c0a34cef80
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     user        root
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     group       root
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     maxconn     1024
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     pidfile     /var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     daemon
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: defaults
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     log global
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     mode http
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     option httplog
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     option dontlognull
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     option http-server-close
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     option forwardfor
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     retries                 3
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     timeout http-request    30s
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     timeout connect         30s
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     timeout client          32s
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     timeout server          32s
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     timeout http-keep-alive 30s
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: listen listener
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     bind 169.254.169.254:80
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     server metadata /var/lib/neutron/metadata_proxy
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:     http-request add-header X-OVN-Network-ID 2c29168d-89f5-4fdd-a1dd-76c0a34cef80
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 02 23:52:57 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:52:57.763 104348 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'env', 'PROCESS_TAG=haproxy-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 02 23:52:57 compute-1 nova_compute[187157]: 2025-12-02 23:52:57.769 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:57 compute-1 nova_compute[187157]: 2025-12-02 23:52:57.798 187161 INFO nova.compute.manager [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Took 17.41 seconds to build instance.
Dec 02 23:52:58 compute-1 nova_compute[187157]: 2025-12-02 23:52:58.279 187161 DEBUG nova.compute.manager [req-0ac9c62f-262d-4a3f-887a-be1a42ce277f req-f41c57e7-8492-4fcf-b4d3-1ace68fc0488 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Received event network-vif-plugged-81ed667e-774b-42ce-8879-7225f57db1d7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:52:58 compute-1 nova_compute[187157]: 2025-12-02 23:52:58.280 187161 DEBUG oslo_concurrency.lockutils [req-0ac9c62f-262d-4a3f-887a-be1a42ce277f req-f41c57e7-8492-4fcf-b4d3-1ace68fc0488 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "df034907-3b38-4357-af57-2750b437bf22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:58 compute-1 nova_compute[187157]: 2025-12-02 23:52:58.281 187161 DEBUG oslo_concurrency.lockutils [req-0ac9c62f-262d-4a3f-887a-be1a42ce277f req-f41c57e7-8492-4fcf-b4d3-1ace68fc0488 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "df034907-3b38-4357-af57-2750b437bf22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:58 compute-1 nova_compute[187157]: 2025-12-02 23:52:58.281 187161 DEBUG oslo_concurrency.lockutils [req-0ac9c62f-262d-4a3f-887a-be1a42ce277f req-f41c57e7-8492-4fcf-b4d3-1ace68fc0488 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "df034907-3b38-4357-af57-2750b437bf22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:58 compute-1 nova_compute[187157]: 2025-12-02 23:52:58.281 187161 DEBUG nova.compute.manager [req-0ac9c62f-262d-4a3f-887a-be1a42ce277f req-f41c57e7-8492-4fcf-b4d3-1ace68fc0488 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] No waiting events found dispatching network-vif-plugged-81ed667e-774b-42ce-8879-7225f57db1d7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:52:58 compute-1 nova_compute[187157]: 2025-12-02 23:52:58.281 187161 WARNING nova.compute.manager [req-0ac9c62f-262d-4a3f-887a-be1a42ce277f req-f41c57e7-8492-4fcf-b4d3-1ace68fc0488 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Received unexpected event network-vif-plugged-81ed667e-774b-42ce-8879-7225f57db1d7 for instance with vm_state active and task_state None.
Dec 02 23:52:58 compute-1 nova_compute[187157]: 2025-12-02 23:52:58.303 187161 DEBUG oslo_concurrency.lockutils [None req-121aeabc-2db2-4375-b778-504137bc71c6 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "df034907-3b38-4357-af57-2750b437bf22" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.929s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:58 compute-1 nova_compute[187157]: 2025-12-02 23:52:58.304 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "df034907-3b38-4357-af57-2750b437bf22" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 14.316s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:58 compute-1 nova_compute[187157]: 2025-12-02 23:52:58.304 187161 INFO nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: df034907-3b38-4357-af57-2750b437bf22] During sync_power_state the instance has a pending task (networking). Skip.
Dec 02 23:52:58 compute-1 nova_compute[187157]: 2025-12-02 23:52:58.305 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "df034907-3b38-4357-af57-2750b437bf22" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:58 compute-1 podman[209396]: 2025-12-02 23:52:58.337583032 +0000 UTC m=+0.083997761 container create 5fa865e55de0f794d81e04f4e67516d85828ef930cbd8b2bd20294d4d12b51e1 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:52:58 compute-1 podman[209396]: 2025-12-02 23:52:58.296959015 +0000 UTC m=+0.043373784 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 02 23:52:58 compute-1 systemd[1]: Started libpod-conmon-5fa865e55de0f794d81e04f4e67516d85828ef930cbd8b2bd20294d4d12b51e1.scope.
Dec 02 23:52:58 compute-1 systemd[1]: Started libcrun container.
Dec 02 23:52:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4570458e55bbe48d8bbb35f337de4780aa677ee19cc22466b3b068d4d0bfc31e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 23:52:58 compute-1 podman[209396]: 2025-12-02 23:52:58.468225825 +0000 UTC m=+0.214640604 container init 5fa865e55de0f794d81e04f4e67516d85828ef930cbd8b2bd20294d4d12b51e1 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 23:52:58 compute-1 podman[209396]: 2025-12-02 23:52:58.478052901 +0000 UTC m=+0.224467590 container start 5fa865e55de0f794d81e04f4e67516d85828ef930cbd8b2bd20294d4d12b51e1 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Dec 02 23:52:58 compute-1 neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80[209411]: [NOTICE]   (209415) : New worker (209417) forked
Dec 02 23:52:58 compute-1 neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80[209411]: [NOTICE]   (209415) : Loading success.
Dec 02 23:53:01 compute-1 nova_compute[187157]: 2025-12-02 23:53:01.351 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:01.693 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:01.694 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:01.695 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:02 compute-1 nova_compute[187157]: 2025-12-02 23:53:02.402 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:05 compute-1 podman[209427]: 2025-12-02 23:53:05.272868429 +0000 UTC m=+0.099540755 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, config_id=edpm, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git)
Dec 02 23:53:05 compute-1 podman[197537]: time="2025-12-02T23:53:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:53:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:53:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 02 23:53:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:53:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3045 "" "Go-http-client/1.1"
Dec 02 23:53:06 compute-1 nova_compute[187157]: 2025-12-02 23:53:06.397 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:06 compute-1 nova_compute[187157]: 2025-12-02 23:53:06.737 187161 DEBUG oslo_concurrency.lockutils [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "df034907-3b38-4357-af57-2750b437bf22" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:06 compute-1 nova_compute[187157]: 2025-12-02 23:53:06.738 187161 DEBUG oslo_concurrency.lockutils [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "df034907-3b38-4357-af57-2750b437bf22" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:06 compute-1 nova_compute[187157]: 2025-12-02 23:53:06.738 187161 DEBUG oslo_concurrency.lockutils [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "df034907-3b38-4357-af57-2750b437bf22-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:06 compute-1 nova_compute[187157]: 2025-12-02 23:53:06.738 187161 DEBUG oslo_concurrency.lockutils [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "df034907-3b38-4357-af57-2750b437bf22-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:06 compute-1 nova_compute[187157]: 2025-12-02 23:53:06.739 187161 DEBUG oslo_concurrency.lockutils [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "df034907-3b38-4357-af57-2750b437bf22-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:07 compute-1 nova_compute[187157]: 2025-12-02 23:53:07.235 187161 INFO nova.compute.manager [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Terminating instance
Dec 02 23:53:07 compute-1 nova_compute[187157]: 2025-12-02 23:53:07.405 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:07 compute-1 nova_compute[187157]: 2025-12-02 23:53:07.876 187161 DEBUG nova.compute.manager [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 02 23:53:07 compute-1 kernel: tap81ed667e-77 (unregistering): left promiscuous mode
Dec 02 23:53:07 compute-1 NetworkManager[55553]: <info>  [1764719587.9134] device (tap81ed667e-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 02 23:53:07 compute-1 nova_compute[187157]: 2025-12-02 23:53:07.923 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:07 compute-1 ovn_controller[95464]: 2025-12-02T23:53:07Z|00044|binding|INFO|Releasing lport 81ed667e-774b-42ce-8879-7225f57db1d7 from this chassis (sb_readonly=0)
Dec 02 23:53:07 compute-1 ovn_controller[95464]: 2025-12-02T23:53:07Z|00045|binding|INFO|Setting lport 81ed667e-774b-42ce-8879-7225f57db1d7 down in Southbound
Dec 02 23:53:07 compute-1 ovn_controller[95464]: 2025-12-02T23:53:07Z|00046|binding|INFO|Removing iface tap81ed667e-77 ovn-installed in OVS
Dec 02 23:53:07 compute-1 nova_compute[187157]: 2025-12-02 23:53:07.926 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:07 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:07.952 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:e5:22 10.100.0.6'], port_security=['fa:16:3e:19:e5:22 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'df034907-3b38-4357-af57-2750b437bf22', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '059ee4b8b9ab47ffbc539c03339a4112', 'neutron:revision_number': '5', 'neutron:security_group_ids': '9f795f84-34cd-4429-87ae-cc3f6a2442e6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0f7b0db-e402-4898-a011-4ec4b4fff19a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=81ed667e-774b-42ce-8879-7225f57db1d7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:53:07 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:07.955 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 81ed667e-774b-42ce-8879-7225f57db1d7 in datapath 2c29168d-89f5-4fdd-a1dd-76c0a34cef80 unbound from our chassis
Dec 02 23:53:07 compute-1 nova_compute[187157]: 2025-12-02 23:53:07.957 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:07 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:07.958 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2c29168d-89f5-4fdd-a1dd-76c0a34cef80, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:53:07 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:07.960 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b46e7171-58ed-4c1e-809d-8104399b5ad1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:07 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:07.961 104348 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80 namespace which is not needed anymore
Dec 02 23:53:07 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Dec 02 23:53:07 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 12.079s CPU time.
Dec 02 23:53:07 compute-1 systemd-machined[153454]: Machine qemu-1-instance-00000003 terminated.
Dec 02 23:53:08 compute-1 podman[209496]: 2025-12-02 23:53:08.143698863 +0000 UTC m=+0.048859826 container kill 5fa865e55de0f794d81e04f4e67516d85828ef930cbd8b2bd20294d4d12b51e1 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 02 23:53:08 compute-1 neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80[209411]: [NOTICE]   (209415) : haproxy version is 3.0.5-8e879a5
Dec 02 23:53:08 compute-1 neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80[209411]: [NOTICE]   (209415) : path to executable is /usr/sbin/haproxy
Dec 02 23:53:08 compute-1 neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80[209411]: [WARNING]  (209415) : Exiting Master process...
Dec 02 23:53:08 compute-1 neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80[209411]: [ALERT]    (209415) : Current worker (209417) exited with code 143 (Terminated)
Dec 02 23:53:08 compute-1 neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80[209411]: [WARNING]  (209415) : All workers exited. Exiting... (0)
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.148 187161 INFO nova.virt.libvirt.driver [-] [instance: df034907-3b38-4357-af57-2750b437bf22] Instance destroyed successfully.
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.149 187161 DEBUG nova.objects.instance [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lazy-loading 'resources' on Instance uuid df034907-3b38-4357-af57-2750b437bf22 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:53:08 compute-1 systemd[1]: libpod-5fa865e55de0f794d81e04f4e67516d85828ef930cbd8b2bd20294d4d12b51e1.scope: Deactivated successfully.
Dec 02 23:53:08 compute-1 podman[209527]: 2025-12-02 23:53:08.208124883 +0000 UTC m=+0.036048518 container died 5fa865e55de0f794d81e04f4e67516d85828ef930cbd8b2bd20294d4d12b51e1 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Dec 02 23:53:08 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5fa865e55de0f794d81e04f4e67516d85828ef930cbd8b2bd20294d4d12b51e1-userdata-shm.mount: Deactivated successfully.
Dec 02 23:53:08 compute-1 systemd[1]: var-lib-containers-storage-overlay-4570458e55bbe48d8bbb35f337de4780aa677ee19cc22466b3b068d4d0bfc31e-merged.mount: Deactivated successfully.
Dec 02 23:53:08 compute-1 podman[209527]: 2025-12-02 23:53:08.25124583 +0000 UTC m=+0.079169445 container cleanup 5fa865e55de0f794d81e04f4e67516d85828ef930cbd8b2bd20294d4d12b51e1 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 02 23:53:08 compute-1 systemd[1]: libpod-conmon-5fa865e55de0f794d81e04f4e67516d85828ef930cbd8b2bd20294d4d12b51e1.scope: Deactivated successfully.
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.266 187161 DEBUG nova.compute.manager [req-b306e07c-8e0e-48a8-b2c8-51ed7bd7fdc0 req-d56189bd-0cb9-460b-8ecb-b3094faed9da 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Received event network-vif-unplugged-81ed667e-774b-42ce-8879-7225f57db1d7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.268 187161 DEBUG oslo_concurrency.lockutils [req-b306e07c-8e0e-48a8-b2c8-51ed7bd7fdc0 req-d56189bd-0cb9-460b-8ecb-b3094faed9da 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "df034907-3b38-4357-af57-2750b437bf22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.268 187161 DEBUG oslo_concurrency.lockutils [req-b306e07c-8e0e-48a8-b2c8-51ed7bd7fdc0 req-d56189bd-0cb9-460b-8ecb-b3094faed9da 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "df034907-3b38-4357-af57-2750b437bf22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.268 187161 DEBUG oslo_concurrency.lockutils [req-b306e07c-8e0e-48a8-b2c8-51ed7bd7fdc0 req-d56189bd-0cb9-460b-8ecb-b3094faed9da 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "df034907-3b38-4357-af57-2750b437bf22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.268 187161 DEBUG nova.compute.manager [req-b306e07c-8e0e-48a8-b2c8-51ed7bd7fdc0 req-d56189bd-0cb9-460b-8ecb-b3094faed9da 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] No waiting events found dispatching network-vif-unplugged-81ed667e-774b-42ce-8879-7225f57db1d7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.269 187161 DEBUG nova.compute.manager [req-b306e07c-8e0e-48a8-b2c8-51ed7bd7fdc0 req-d56189bd-0cb9-460b-8ecb-b3094faed9da 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Received event network-vif-unplugged-81ed667e-774b-42ce-8879-7225f57db1d7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:53:08 compute-1 podman[209529]: 2025-12-02 23:53:08.271297532 +0000 UTC m=+0.084189026 container remove 5fa865e55de0f794d81e04f4e67516d85828ef930cbd8b2bd20294d4d12b51e1 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 02 23:53:08 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:08.280 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[da74319a-603f-4f81-a8ba-bc7a1e5190b3]: (4, ("Tue Dec  2 11:53:08 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80 (5fa865e55de0f794d81e04f4e67516d85828ef930cbd8b2bd20294d4d12b51e1)\n5fa865e55de0f794d81e04f4e67516d85828ef930cbd8b2bd20294d4d12b51e1\nTue Dec  2 11:53:08 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80 (5fa865e55de0f794d81e04f4e67516d85828ef930cbd8b2bd20294d4d12b51e1)\n5fa865e55de0f794d81e04f4e67516d85828ef930cbd8b2bd20294d4d12b51e1\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:08 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:08.281 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[caeb90c2-d1cc-4b86-8ea1-a2a1693786bf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:08 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:08.281 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:53:08 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:08.282 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[250c3440-7c51-42eb-8897-63d724145c63]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:08 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:08.282 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c29168d-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.284 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:08 compute-1 kernel: tap2c29168d-80: left promiscuous mode
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.302 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:08 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:08.304 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[0e863052-e03d-42c2-bb4b-4a60cf71122a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:08 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:08.323 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[617b8245-042c-4dc8-be8c-e5b3b0d4a006]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:08 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:08.324 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[020f3a46-c7f6-49d5-835b-975305eafced]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:08 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:08.343 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ef56c353-61fd-4fea-8330-b039d0edced3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 359707, 'reachable_time': 33644, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209561, 'error': None, 'target': 'ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:08 compute-1 systemd[1]: run-netns-ovnmeta\x2d2c29168d\x2d89f5\x2d4fdd\x2da1dd\x2d76c0a34cef80.mount: Deactivated successfully.
Dec 02 23:53:08 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:08.348 104464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 02 23:53:08 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:08.349 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[8179a433-cf26-47a7-9d9e-a21863b26ca6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.658 187161 DEBUG nova.virt.libvirt.vif [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-02T23:52:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-328293485',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testdatamodel-server-328293485',id=3,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:52:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='059ee4b8b9ab47ffbc539c03339a4112',ramdisk_id='',reservation_id='r-d9nchw4g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_pro
ject_name='tempest-TestDataModel-1253061916',owner_user_name='tempest-TestDataModel-1253061916-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T23:52:57Z,user_data=None,user_id='d032790eea2c4094b69ea4a2576bff68',uuid=df034907-3b38-4357-af57-2750b437bf22,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "81ed667e-774b-42ce-8879-7225f57db1d7", "address": "fa:16:3e:19:e5:22", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81ed667e-77", "ovs_interfaceid": "81ed667e-774b-42ce-8879-7225f57db1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.659 187161 DEBUG nova.network.os_vif_util [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Converting VIF {"id": "81ed667e-774b-42ce-8879-7225f57db1d7", "address": "fa:16:3e:19:e5:22", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81ed667e-77", "ovs_interfaceid": "81ed667e-774b-42ce-8879-7225f57db1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.660 187161 DEBUG nova.network.os_vif_util [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:e5:22,bridge_name='br-int',has_traffic_filtering=True,id=81ed667e-774b-42ce-8879-7225f57db1d7,network=Network(2c29168d-89f5-4fdd-a1dd-76c0a34cef80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81ed667e-77') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.661 187161 DEBUG os_vif [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:e5:22,bridge_name='br-int',has_traffic_filtering=True,id=81ed667e-774b-42ce-8879-7225f57db1d7,network=Network(2c29168d-89f5-4fdd-a1dd-76c0a34cef80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81ed667e-77') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.664 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.665 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81ed667e-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.667 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.670 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.672 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.673 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e24e4e8a-bc9c-42f0-b62d-f559098bfd5b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.674 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.676 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.680 187161 INFO os_vif [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:e5:22,bridge_name='br-int',has_traffic_filtering=True,id=81ed667e-774b-42ce-8879-7225f57db1d7,network=Network(2c29168d-89f5-4fdd-a1dd-76c0a34cef80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81ed667e-77')
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.681 187161 INFO nova.virt.libvirt.driver [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Deleting instance files /var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22_del
Dec 02 23:53:08 compute-1 nova_compute[187157]: 2025-12-02 23:53:08.682 187161 INFO nova.virt.libvirt.driver [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Deletion of /var/lib/nova/instances/df034907-3b38-4357-af57-2750b437bf22_del complete
Dec 02 23:53:09 compute-1 nova_compute[187157]: 2025-12-02 23:53:09.201 187161 INFO nova.compute.manager [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Took 1.32 seconds to destroy the instance on the hypervisor.
Dec 02 23:53:09 compute-1 nova_compute[187157]: 2025-12-02 23:53:09.202 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 02 23:53:09 compute-1 nova_compute[187157]: 2025-12-02 23:53:09.203 187161 DEBUG nova.compute.manager [-] [instance: df034907-3b38-4357-af57-2750b437bf22] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 02 23:53:09 compute-1 nova_compute[187157]: 2025-12-02 23:53:09.203 187161 DEBUG nova.network.neutron [-] [instance: df034907-3b38-4357-af57-2750b437bf22] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 02 23:53:09 compute-1 nova_compute[187157]: 2025-12-02 23:53:09.204 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:53:10 compute-1 nova_compute[187157]: 2025-12-02 23:53:10.072 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:53:10 compute-1 nova_compute[187157]: 2025-12-02 23:53:10.348 187161 DEBUG nova.compute.manager [req-7223c75d-9347-417e-a329-801453cc346c req-5e6c031c-a496-400c-be90-ea400a3c8f7c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Received event network-vif-unplugged-81ed667e-774b-42ce-8879-7225f57db1d7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:53:10 compute-1 nova_compute[187157]: 2025-12-02 23:53:10.348 187161 DEBUG oslo_concurrency.lockutils [req-7223c75d-9347-417e-a329-801453cc346c req-5e6c031c-a496-400c-be90-ea400a3c8f7c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "df034907-3b38-4357-af57-2750b437bf22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:10 compute-1 nova_compute[187157]: 2025-12-02 23:53:10.348 187161 DEBUG oslo_concurrency.lockutils [req-7223c75d-9347-417e-a329-801453cc346c req-5e6c031c-a496-400c-be90-ea400a3c8f7c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "df034907-3b38-4357-af57-2750b437bf22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:10 compute-1 nova_compute[187157]: 2025-12-02 23:53:10.348 187161 DEBUG oslo_concurrency.lockutils [req-7223c75d-9347-417e-a329-801453cc346c req-5e6c031c-a496-400c-be90-ea400a3c8f7c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "df034907-3b38-4357-af57-2750b437bf22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:10 compute-1 nova_compute[187157]: 2025-12-02 23:53:10.349 187161 DEBUG nova.compute.manager [req-7223c75d-9347-417e-a329-801453cc346c req-5e6c031c-a496-400c-be90-ea400a3c8f7c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] No waiting events found dispatching network-vif-unplugged-81ed667e-774b-42ce-8879-7225f57db1d7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:53:10 compute-1 nova_compute[187157]: 2025-12-02 23:53:10.349 187161 DEBUG nova.compute.manager [req-7223c75d-9347-417e-a329-801453cc346c req-5e6c031c-a496-400c-be90-ea400a3c8f7c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Received event network-vif-unplugged-81ed667e-774b-42ce-8879-7225f57db1d7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:53:11 compute-1 podman[209563]: 2025-12-02 23:53:11.24035636 +0000 UTC m=+0.075985949 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 23:53:11 compute-1 nova_compute[187157]: 2025-12-02 23:53:11.249 187161 DEBUG nova.compute.manager [req-831f7b1f-3006-4306-8a04-1aac616f6571 req-4f83af4f-6488-455f-bb49-424ec5143730 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Received event network-vif-deleted-81ed667e-774b-42ce-8879-7225f57db1d7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:53:11 compute-1 nova_compute[187157]: 2025-12-02 23:53:11.250 187161 INFO nova.compute.manager [req-831f7b1f-3006-4306-8a04-1aac616f6571 req-4f83af4f-6488-455f-bb49-424ec5143730 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Neutron deleted interface 81ed667e-774b-42ce-8879-7225f57db1d7; detaching it from the instance and deleting it from the info cache
Dec 02 23:53:11 compute-1 nova_compute[187157]: 2025-12-02 23:53:11.250 187161 DEBUG nova.network.neutron [req-831f7b1f-3006-4306-8a04-1aac616f6571 req-4f83af4f-6488-455f-bb49-424ec5143730 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:53:11 compute-1 nova_compute[187157]: 2025-12-02 23:53:11.427 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:11 compute-1 nova_compute[187157]: 2025-12-02 23:53:11.612 187161 DEBUG nova.network.neutron [-] [instance: df034907-3b38-4357-af57-2750b437bf22] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:53:11 compute-1 nova_compute[187157]: 2025-12-02 23:53:11.759 187161 DEBUG nova.compute.manager [req-831f7b1f-3006-4306-8a04-1aac616f6571 req-4f83af4f-6488-455f-bb49-424ec5143730 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: df034907-3b38-4357-af57-2750b437bf22] Detach interface failed, port_id=81ed667e-774b-42ce-8879-7225f57db1d7, reason: Instance df034907-3b38-4357-af57-2750b437bf22 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 02 23:53:12 compute-1 nova_compute[187157]: 2025-12-02 23:53:12.120 187161 INFO nova.compute.manager [-] [instance: df034907-3b38-4357-af57-2750b437bf22] Took 2.92 seconds to deallocate network for instance.
Dec 02 23:53:12 compute-1 nova_compute[187157]: 2025-12-02 23:53:12.645 187161 DEBUG oslo_concurrency.lockutils [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:12 compute-1 nova_compute[187157]: 2025-12-02 23:53:12.646 187161 DEBUG oslo_concurrency.lockutils [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:12 compute-1 nova_compute[187157]: 2025-12-02 23:53:12.723 187161 DEBUG nova.compute.provider_tree [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Updating inventory in ProviderTree for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 02 23:53:13 compute-1 nova_compute[187157]: 2025-12-02 23:53:13.271 187161 ERROR nova.scheduler.client.report [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [req-f0abbfa7-f3d4-4221-9325-026c1bbc07fc] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID a6c5ccbf-f26a-4e87-95da-56336ae0b343.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-f0abbfa7-f3d4-4221-9325-026c1bbc07fc"}]}
Dec 02 23:53:13 compute-1 nova_compute[187157]: 2025-12-02 23:53:13.301 187161 DEBUG nova.scheduler.client.report [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Refreshing inventories for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 02 23:53:13 compute-1 nova_compute[187157]: 2025-12-02 23:53:13.322 187161 DEBUG nova.scheduler.client.report [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Updating ProviderTree inventory for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 02 23:53:13 compute-1 nova_compute[187157]: 2025-12-02 23:53:13.322 187161 DEBUG nova.compute.provider_tree [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Updating inventory in ProviderTree for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 02 23:53:13 compute-1 nova_compute[187157]: 2025-12-02 23:53:13.343 187161 DEBUG nova.scheduler.client.report [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Refreshing aggregate associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 02 23:53:13 compute-1 nova_compute[187157]: 2025-12-02 23:53:13.367 187161 DEBUG nova.scheduler.client.report [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Refreshing trait associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ARCH_X86_64,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE
 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 02 23:53:13 compute-1 nova_compute[187157]: 2025-12-02 23:53:13.419 187161 DEBUG nova.compute.provider_tree [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Updating inventory in ProviderTree for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 02 23:53:13 compute-1 nova_compute[187157]: 2025-12-02 23:53:13.699 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:13 compute-1 nova_compute[187157]: 2025-12-02 23:53:13.977 187161 DEBUG nova.scheduler.client.report [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Updated inventory for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Dec 02 23:53:13 compute-1 nova_compute[187157]: 2025-12-02 23:53:13.979 187161 DEBUG nova.compute.provider_tree [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Updating resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 02 23:53:13 compute-1 nova_compute[187157]: 2025-12-02 23:53:13.979 187161 DEBUG nova.compute.provider_tree [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Updating inventory in ProviderTree for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 02 23:53:14 compute-1 nova_compute[187157]: 2025-12-02 23:53:14.496 187161 DEBUG oslo_concurrency.lockutils [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.851s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:14 compute-1 nova_compute[187157]: 2025-12-02 23:53:14.527 187161 INFO nova.scheduler.client.report [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Deleted allocations for instance df034907-3b38-4357-af57-2750b437bf22
Dec 02 23:53:15 compute-1 nova_compute[187157]: 2025-12-02 23:53:15.565 187161 DEBUG oslo_concurrency.lockutils [None req-44d9dda3-59d6-4843-b59e-43424aadae3c d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "df034907-3b38-4357-af57-2750b437bf22" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.827s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:16 compute-1 nova_compute[187157]: 2025-12-02 23:53:16.476 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:18 compute-1 nova_compute[187157]: 2025-12-02 23:53:18.702 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:19 compute-1 openstack_network_exporter[199685]: ERROR   23:53:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:53:19 compute-1 openstack_network_exporter[199685]: ERROR   23:53:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:53:19 compute-1 openstack_network_exporter[199685]: ERROR   23:53:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:53:19 compute-1 openstack_network_exporter[199685]: ERROR   23:53:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:53:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:53:19 compute-1 openstack_network_exporter[199685]: ERROR   23:53:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:53:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:53:20 compute-1 podman[209584]: 2025-12-02 23:53:20.244620605 +0000 UTC m=+0.071988813 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 23:53:21 compute-1 nova_compute[187157]: 2025-12-02 23:53:21.478 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:23 compute-1 podman[209608]: 2025-12-02 23:53:23.279758491 +0000 UTC m=+0.103794088 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:53:23 compute-1 nova_compute[187157]: 2025-12-02 23:53:23.704 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:25 compute-1 nova_compute[187157]: 2025-12-02 23:53:25.211 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:26 compute-1 podman[209633]: 2025-12-02 23:53:26.278021761 +0000 UTC m=+0.104022973 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Dec 02 23:53:26 compute-1 nova_compute[187157]: 2025-12-02 23:53:26.480 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:27 compute-1 nova_compute[187157]: 2025-12-02 23:53:27.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:28 compute-1 nova_compute[187157]: 2025-12-02 23:53:28.696 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:28 compute-1 nova_compute[187157]: 2025-12-02 23:53:28.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:28 compute-1 nova_compute[187157]: 2025-12-02 23:53:28.707 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:28 compute-1 nova_compute[187157]: 2025-12-02 23:53:28.734 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:29 compute-1 nova_compute[187157]: 2025-12-02 23:53:29.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:30 compute-1 nova_compute[187157]: 2025-12-02 23:53:30.217 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:30 compute-1 nova_compute[187157]: 2025-12-02 23:53:30.218 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:30 compute-1 nova_compute[187157]: 2025-12-02 23:53:30.218 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:30 compute-1 nova_compute[187157]: 2025-12-02 23:53:30.218 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:53:30 compute-1 nova_compute[187157]: 2025-12-02 23:53:30.438 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:53:30 compute-1 nova_compute[187157]: 2025-12-02 23:53:30.440 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:53:30 compute-1 nova_compute[187157]: 2025-12-02 23:53:30.469 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:53:30 compute-1 nova_compute[187157]: 2025-12-02 23:53:30.470 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5867MB free_disk=73.16728210449219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:53:30 compute-1 nova_compute[187157]: 2025-12-02 23:53:30.471 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:30 compute-1 nova_compute[187157]: 2025-12-02 23:53:30.471 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:31 compute-1 nova_compute[187157]: 2025-12-02 23:53:31.530 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:31 compute-1 nova_compute[187157]: 2025-12-02 23:53:31.852 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:53:31 compute-1 nova_compute[187157]: 2025-12-02 23:53:31.852 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:53:30 up  1:00,  0 user,  load average: 0.25, 0.21, 0.38\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:53:31 compute-1 nova_compute[187157]: 2025-12-02 23:53:31.926 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:53:32 compute-1 nova_compute[187157]: 2025-12-02 23:53:32.536 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:53:33 compute-1 nova_compute[187157]: 2025-12-02 23:53:33.047 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:53:33 compute-1 nova_compute[187157]: 2025-12-02 23:53:33.048 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.577s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:33 compute-1 nova_compute[187157]: 2025-12-02 23:53:33.709 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:34 compute-1 nova_compute[187157]: 2025-12-02 23:53:34.050 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:34 compute-1 nova_compute[187157]: 2025-12-02 23:53:34.051 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:34 compute-1 nova_compute[187157]: 2025-12-02 23:53:34.052 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:34 compute-1 nova_compute[187157]: 2025-12-02 23:53:34.052 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:53:35 compute-1 podman[197537]: time="2025-12-02T23:53:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:53:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:53:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:53:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:53:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Dec 02 23:53:36 compute-1 podman[209656]: 2025-12-02 23:53:36.225160402 +0000 UTC m=+0.068154700 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 23:53:36 compute-1 nova_compute[187157]: 2025-12-02 23:53:36.532 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:38 compute-1 nova_compute[187157]: 2025-12-02 23:53:38.712 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:41 compute-1 nova_compute[187157]: 2025-12-02 23:53:41.533 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:42 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:42.088 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:0f:8c 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ba5fccf757b4adaa08907c11ae17f57', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=9ee451cb-cc6e-44d6-98fb-cdfa0566e521) old=Port_Binding(mac=['fa:16:3e:ab:0f:8c'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ba5fccf757b4adaa08907c11ae17f57', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:53:42 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:42.089 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 9ee451cb-cc6e-44d6-98fb-cdfa0566e521 in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a updated
Dec 02 23:53:42 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:42.090 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ec494140-a5f4-4327-8807-d7248b1cdc9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:53:42 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:42.091 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b5808be2-2356-46ac-95c6-746ff70270f3]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:42 compute-1 podman[209677]: 2025-12-02 23:53:42.254766307 +0000 UTC m=+0.092791234 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible)
Dec 02 23:53:43 compute-1 nova_compute[187157]: 2025-12-02 23:53:43.714 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:46 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:46.431 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:53:46 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:46.433 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:53:46 compute-1 nova_compute[187157]: 2025-12-02 23:53:46.473 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:46 compute-1 nova_compute[187157]: 2025-12-02 23:53:46.535 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:48 compute-1 nova_compute[187157]: 2025-12-02 23:53:48.748 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:49 compute-1 openstack_network_exporter[199685]: ERROR   23:53:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:53:49 compute-1 openstack_network_exporter[199685]: ERROR   23:53:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:53:49 compute-1 openstack_network_exporter[199685]: ERROR   23:53:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:53:49 compute-1 openstack_network_exporter[199685]: ERROR   23:53:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:53:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:53:49 compute-1 openstack_network_exporter[199685]: ERROR   23:53:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:53:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:53:51 compute-1 podman[209699]: 2025-12-02 23:53:51.253665204 +0000 UTC m=+0.085164550 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:53:51 compute-1 nova_compute[187157]: 2025-12-02 23:53:51.537 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:52.107 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:ed:a4 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cb0fa041-521f-435f-82ea-d7eab4f5ab40', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cb0fa041-521f-435f-82ea-d7eab4f5ab40', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1c8e6c60-6229-4bb7-a7c1-8d84b1a1b4af, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=46cd7a9e-86e4-4aaf-b5c5-07e80f59a989) old=Port_Binding(mac=['fa:16:3e:35:ed:a4'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-cb0fa041-521f-435f-82ea-d7eab4f5ab40', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cb0fa041-521f-435f-82ea-d7eab4f5ab40', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:53:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:52.108 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 46cd7a9e-86e4-4aaf-b5c5-07e80f59a989 in datapath cb0fa041-521f-435f-82ea-d7eab4f5ab40 updated
Dec 02 23:53:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:52.108 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cb0fa041-521f-435f-82ea-d7eab4f5ab40, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:53:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:52.109 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[19649c82-5bba-4e95-8209-9172c4219dbc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:53 compute-1 nova_compute[187157]: 2025-12-02 23:53:53.751 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:54 compute-1 podman[209725]: 2025-12-02 23:53:54.249394931 +0000 UTC m=+0.092043504 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 02 23:53:54 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:53:54.435 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:53:56 compute-1 nova_compute[187157]: 2025-12-02 23:53:56.589 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:57 compute-1 podman[209751]: 2025-12-02 23:53:57.211072312 +0000 UTC m=+0.052411153 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 02 23:53:58 compute-1 nova_compute[187157]: 2025-12-02 23:53:58.800 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:01 compute-1 nova_compute[187157]: 2025-12-02 23:54:01.593 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:01.696 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:01.696 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:01.697 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:03 compute-1 ovn_controller[95464]: 2025-12-02T23:54:03Z|00047|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 02 23:54:03 compute-1 nova_compute[187157]: 2025-12-02 23:54:03.801 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:05 compute-1 podman[197537]: time="2025-12-02T23:54:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:54:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:54:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:54:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:54:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2585 "" "Go-http-client/1.1"
Dec 02 23:54:06 compute-1 nova_compute[187157]: 2025-12-02 23:54:06.593 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:07 compute-1 podman[209773]: 2025-12-02 23:54:07.239696424 +0000 UTC m=+0.072334580 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 23:54:08 compute-1 nova_compute[187157]: 2025-12-02 23:54:08.804 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:11 compute-1 nova_compute[187157]: 2025-12-02 23:54:11.598 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:13 compute-1 podman[209795]: 2025-12-02 23:54:13.224670695 +0000 UTC m=+0.067177847 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 02 23:54:13 compute-1 nova_compute[187157]: 2025-12-02 23:54:13.806 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:16 compute-1 nova_compute[187157]: 2025-12-02 23:54:16.603 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:18 compute-1 nova_compute[187157]: 2025-12-02 23:54:18.848 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:19 compute-1 openstack_network_exporter[199685]: ERROR   23:54:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:54:19 compute-1 openstack_network_exporter[199685]: ERROR   23:54:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:54:19 compute-1 openstack_network_exporter[199685]: ERROR   23:54:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:54:19 compute-1 openstack_network_exporter[199685]: ERROR   23:54:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:54:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:54:19 compute-1 openstack_network_exporter[199685]: ERROR   23:54:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:54:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:54:21 compute-1 nova_compute[187157]: 2025-12-02 23:54:21.604 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:22 compute-1 podman[209815]: 2025-12-02 23:54:22.254423723 +0000 UTC m=+0.084767400 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:54:23 compute-1 nova_compute[187157]: 2025-12-02 23:54:23.851 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:25 compute-1 podman[209840]: 2025-12-02 23:54:25.279487577 +0000 UTC m=+0.112386645 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 23:54:26 compute-1 nova_compute[187157]: 2025-12-02 23:54:26.606 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:26 compute-1 nova_compute[187157]: 2025-12-02 23:54:26.702 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:28 compute-1 podman[209866]: 2025-12-02 23:54:28.241727269 +0000 UTC m=+0.071007208 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 02 23:54:28 compute-1 nova_compute[187157]: 2025-12-02 23:54:28.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:28 compute-1 nova_compute[187157]: 2025-12-02 23:54:28.853 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:29 compute-1 nova_compute[187157]: 2025-12-02 23:54:29.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:30 compute-1 nova_compute[187157]: 2025-12-02 23:54:30.696 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:30 compute-1 nova_compute[187157]: 2025-12-02 23:54:30.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:30 compute-1 nova_compute[187157]: 2025-12-02 23:54:30.699 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:54:30 compute-1 nova_compute[187157]: 2025-12-02 23:54:30.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:31 compute-1 nova_compute[187157]: 2025-12-02 23:54:31.215 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:31 compute-1 nova_compute[187157]: 2025-12-02 23:54:31.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:31 compute-1 nova_compute[187157]: 2025-12-02 23:54:31.217 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:31 compute-1 nova_compute[187157]: 2025-12-02 23:54:31.217 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:54:31 compute-1 nova_compute[187157]: 2025-12-02 23:54:31.441 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:54:31 compute-1 nova_compute[187157]: 2025-12-02 23:54:31.443 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:54:31 compute-1 nova_compute[187157]: 2025-12-02 23:54:31.482 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:54:31 compute-1 nova_compute[187157]: 2025-12-02 23:54:31.483 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5893MB free_disk=73.17033386230469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:54:31 compute-1 nova_compute[187157]: 2025-12-02 23:54:31.483 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:31 compute-1 nova_compute[187157]: 2025-12-02 23:54:31.484 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:31 compute-1 nova_compute[187157]: 2025-12-02 23:54:31.609 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:32 compute-1 nova_compute[187157]: 2025-12-02 23:54:32.691 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:54:32 compute-1 nova_compute[187157]: 2025-12-02 23:54:32.691 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:54:31 up  1:01,  0 user,  load average: 0.11, 0.18, 0.35\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:54:32 compute-1 nova_compute[187157]: 2025-12-02 23:54:32.754 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:54:33 compute-1 nova_compute[187157]: 2025-12-02 23:54:33.261 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:54:33 compute-1 nova_compute[187157]: 2025-12-02 23:54:33.774 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:54:33 compute-1 nova_compute[187157]: 2025-12-02 23:54:33.774 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.291s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:33 compute-1 nova_compute[187157]: 2025-12-02 23:54:33.855 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:35 compute-1 podman[197537]: time="2025-12-02T23:54:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:54:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:54:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:54:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:54:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Dec 02 23:54:36 compute-1 sshd-session[209887]: Invalid user solana from 193.32.162.146 port 48662
Dec 02 23:54:36 compute-1 sshd-session[209887]: Connection closed by invalid user solana 193.32.162.146 port 48662 [preauth]
Dec 02 23:54:36 compute-1 nova_compute[187157]: 2025-12-02 23:54:36.612 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:36 compute-1 nova_compute[187157]: 2025-12-02 23:54:36.772 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:37 compute-1 nova_compute[187157]: 2025-12-02 23:54:37.289 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:37 compute-1 nova_compute[187157]: 2025-12-02 23:54:37.290 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:38 compute-1 podman[209889]: 2025-12-02 23:54:38.235046143 +0000 UTC m=+0.066955571 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 23:54:38 compute-1 nova_compute[187157]: 2025-12-02 23:54:38.308 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:38 compute-1 nova_compute[187157]: 2025-12-02 23:54:38.309 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:38 compute-1 nova_compute[187157]: 2025-12-02 23:54:38.813 187161 DEBUG nova.compute.manager [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 02 23:54:38 compute-1 nova_compute[187157]: 2025-12-02 23:54:38.858 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:39 compute-1 nova_compute[187157]: 2025-12-02 23:54:39.421 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:39 compute-1 nova_compute[187157]: 2025-12-02 23:54:39.421 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:39 compute-1 nova_compute[187157]: 2025-12-02 23:54:39.428 187161 DEBUG nova.virt.hardware [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 02 23:54:39 compute-1 nova_compute[187157]: 2025-12-02 23:54:39.428 187161 INFO nova.compute.claims [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Claim successful on node compute-1.ctlplane.example.com
Dec 02 23:54:40 compute-1 nova_compute[187157]: 2025-12-02 23:54:40.499 187161 DEBUG nova.compute.provider_tree [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:54:41 compute-1 nova_compute[187157]: 2025-12-02 23:54:41.008 187161 DEBUG nova.scheduler.client.report [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:54:41 compute-1 nova_compute[187157]: 2025-12-02 23:54:41.519 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:41 compute-1 nova_compute[187157]: 2025-12-02 23:54:41.520 187161 DEBUG nova.compute.manager [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 02 23:54:41 compute-1 nova_compute[187157]: 2025-12-02 23:54:41.615 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:42 compute-1 nova_compute[187157]: 2025-12-02 23:54:42.039 187161 DEBUG nova.compute.manager [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 02 23:54:42 compute-1 nova_compute[187157]: 2025-12-02 23:54:42.040 187161 DEBUG nova.network.neutron [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 02 23:54:42 compute-1 nova_compute[187157]: 2025-12-02 23:54:42.040 187161 WARNING neutronclient.v2_0.client [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:54:42 compute-1 nova_compute[187157]: 2025-12-02 23:54:42.041 187161 WARNING neutronclient.v2_0.client [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:54:42 compute-1 nova_compute[187157]: 2025-12-02 23:54:42.551 187161 INFO nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 02 23:54:43 compute-1 nova_compute[187157]: 2025-12-02 23:54:43.062 187161 DEBUG nova.compute.manager [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 02 23:54:43 compute-1 nova_compute[187157]: 2025-12-02 23:54:43.860 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.087 187161 DEBUG nova.compute.manager [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.089 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.090 187161 INFO nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Creating image(s)
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.090 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "/var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.091 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "/var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.092 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "/var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.092 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.096 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.098 187161 DEBUG oslo_concurrency.processutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.172 187161 DEBUG oslo_concurrency.processutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.174 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.175 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.176 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.183 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.184 187161 DEBUG oslo_concurrency.processutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:54:44 compute-1 podman[209910]: 2025-12-02 23:54:44.223560239 +0000 UTC m=+0.069374720 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd)
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.241 187161 DEBUG oslo_concurrency.processutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.242 187161 DEBUG oslo_concurrency.processutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.282 187161 DEBUG oslo_concurrency.processutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.284 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.285 187161 DEBUG oslo_concurrency.processutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.343 187161 DEBUG oslo_concurrency.processutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.344 187161 DEBUG nova.virt.disk.api [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Checking if we can resize image /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.345 187161 DEBUG oslo_concurrency.processutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.415 187161 DEBUG oslo_concurrency.processutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.416 187161 DEBUG nova.virt.disk.api [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Cannot resize image /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.417 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.417 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Ensure instance console log exists: /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.417 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.417 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.418 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:44 compute-1 nova_compute[187157]: 2025-12-02 23:54:44.454 187161 DEBUG nova.network.neutron [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Successfully created port: 71da42a2-d97f-47f6-999c-93e4ef78e6e2 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 02 23:54:45 compute-1 nova_compute[187157]: 2025-12-02 23:54:45.308 187161 DEBUG nova.network.neutron [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Successfully updated port: 71da42a2-d97f-47f6-999c-93e4ef78e6e2 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 02 23:54:45 compute-1 nova_compute[187157]: 2025-12-02 23:54:45.369 187161 DEBUG nova.compute.manager [req-c4c4097e-80c4-465c-9682-f1dac00c17bb req-628143d7-1eeb-45cb-a403-e784fc4f8c65 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Received event network-changed-71da42a2-d97f-47f6-999c-93e4ef78e6e2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:54:45 compute-1 nova_compute[187157]: 2025-12-02 23:54:45.369 187161 DEBUG nova.compute.manager [req-c4c4097e-80c4-465c-9682-f1dac00c17bb req-628143d7-1eeb-45cb-a403-e784fc4f8c65 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Refreshing instance network info cache due to event network-changed-71da42a2-d97f-47f6-999c-93e4ef78e6e2. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 02 23:54:45 compute-1 nova_compute[187157]: 2025-12-02 23:54:45.369 187161 DEBUG oslo_concurrency.lockutils [req-c4c4097e-80c4-465c-9682-f1dac00c17bb req-628143d7-1eeb-45cb-a403-e784fc4f8c65 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-2e1c5d01-3310-41d8-8a6d-780b09f6bf06" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:54:45 compute-1 nova_compute[187157]: 2025-12-02 23:54:45.369 187161 DEBUG oslo_concurrency.lockutils [req-c4c4097e-80c4-465c-9682-f1dac00c17bb req-628143d7-1eeb-45cb-a403-e784fc4f8c65 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-2e1c5d01-3310-41d8-8a6d-780b09f6bf06" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:54:45 compute-1 nova_compute[187157]: 2025-12-02 23:54:45.369 187161 DEBUG nova.network.neutron [req-c4c4097e-80c4-465c-9682-f1dac00c17bb req-628143d7-1eeb-45cb-a403-e784fc4f8c65 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Refreshing network info cache for port 71da42a2-d97f-47f6-999c-93e4ef78e6e2 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 02 23:54:45 compute-1 nova_compute[187157]: 2025-12-02 23:54:45.815 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "refresh_cache-2e1c5d01-3310-41d8-8a6d-780b09f6bf06" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:54:45 compute-1 nova_compute[187157]: 2025-12-02 23:54:45.874 187161 WARNING neutronclient.v2_0.client [req-c4c4097e-80c4-465c-9682-f1dac00c17bb req-628143d7-1eeb-45cb-a403-e784fc4f8c65 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:54:46 compute-1 nova_compute[187157]: 2025-12-02 23:54:46.067 187161 DEBUG nova.network.neutron [req-c4c4097e-80c4-465c-9682-f1dac00c17bb req-628143d7-1eeb-45cb-a403-e784fc4f8c65 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:54:46 compute-1 nova_compute[187157]: 2025-12-02 23:54:46.212 187161 DEBUG nova.network.neutron [req-c4c4097e-80c4-465c-9682-f1dac00c17bb req-628143d7-1eeb-45cb-a403-e784fc4f8c65 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:54:46 compute-1 nova_compute[187157]: 2025-12-02 23:54:46.617 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:46 compute-1 nova_compute[187157]: 2025-12-02 23:54:46.881 187161 DEBUG oslo_concurrency.lockutils [req-c4c4097e-80c4-465c-9682-f1dac00c17bb req-628143d7-1eeb-45cb-a403-e784fc4f8c65 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-2e1c5d01-3310-41d8-8a6d-780b09f6bf06" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:54:46 compute-1 nova_compute[187157]: 2025-12-02 23:54:46.882 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquired lock "refresh_cache-2e1c5d01-3310-41d8-8a6d-780b09f6bf06" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:54:46 compute-1 nova_compute[187157]: 2025-12-02 23:54:46.882 187161 DEBUG nova.network.neutron [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:54:47 compute-1 nova_compute[187157]: 2025-12-02 23:54:47.501 187161 DEBUG nova.network.neutron [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:54:47 compute-1 nova_compute[187157]: 2025-12-02 23:54:47.730 187161 WARNING neutronclient.v2_0.client [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:54:47 compute-1 nova_compute[187157]: 2025-12-02 23:54:47.921 187161 DEBUG nova.network.neutron [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Updating instance_info_cache with network_info: [{"id": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "address": "fa:16:3e:e2:d6:d0", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71da42a2-d9", "ovs_interfaceid": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.427 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Releasing lock "refresh_cache-2e1c5d01-3310-41d8-8a6d-780b09f6bf06" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.428 187161 DEBUG nova.compute.manager [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Instance network_info: |[{"id": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "address": "fa:16:3e:e2:d6:d0", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71da42a2-d9", "ovs_interfaceid": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.432 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Start _get_guest_xml network_info=[{"id": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "address": "fa:16:3e:e2:d6:d0", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71da42a2-d9", "ovs_interfaceid": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.438 187161 WARNING nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.440 187161 DEBUG nova.virt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1874250387', uuid='2e1c5d01-3310-41d8-8a6d-780b09f6bf06'), owner=OwnerMeta(userid='d31b8a74cb3c48f3b147970eec936bca', username='tempest-TestExecuteActionsViaActuator-1889160444-project-admin', projectid='5f2368878ee9447ea8fcef9927711e2d', projectname='tempest-TestExecuteActionsViaActuator-1889160444'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "address": "fa:16:3e:e2:d6:d0", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71da42a2-d9", "ovs_interfaceid": 
"71da42a2-d97f-47f6-999c-93e4ef78e6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764719688.4405167) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.447 187161 DEBUG nova.virt.libvirt.host [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.448 187161 DEBUG nova.virt.libvirt.host [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.452 187161 DEBUG nova.virt.libvirt.host [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.453 187161 DEBUG nova.virt.libvirt.host [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.455 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.455 187161 DEBUG nova.virt.hardware [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.456 187161 DEBUG nova.virt.hardware [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.457 187161 DEBUG nova.virt.hardware [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.457 187161 DEBUG nova.virt.hardware [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.458 187161 DEBUG nova.virt.hardware [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.458 187161 DEBUG nova.virt.hardware [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.458 187161 DEBUG nova.virt.hardware [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.459 187161 DEBUG nova.virt.hardware [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.460 187161 DEBUG nova.virt.hardware [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.460 187161 DEBUG nova.virt.hardware [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.460 187161 DEBUG nova.virt.hardware [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.469 187161 DEBUG nova.virt.libvirt.vif [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-02T23:54:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1874250387',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1874250387',id=5,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-zhbwq71l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteAction
sViaActuator-1889160444-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:54:43Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=2e1c5d01-3310-41d8-8a6d-780b09f6bf06,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "address": "fa:16:3e:e2:d6:d0", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71da42a2-d9", "ovs_interfaceid": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.469 187161 DEBUG nova.network.os_vif_util [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "address": "fa:16:3e:e2:d6:d0", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71da42a2-d9", "ovs_interfaceid": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.470 187161 DEBUG nova.network.os_vif_util [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:d6:d0,bridge_name='br-int',has_traffic_filtering=True,id=71da42a2-d97f-47f6-999c-93e4ef78e6e2,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71da42a2-d9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.472 187161 DEBUG nova.objects.instance [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lazy-loading 'pci_devices' on Instance uuid 2e1c5d01-3310-41d8-8a6d-780b09f6bf06 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.865 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.983 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] End _get_guest_xml xml=<domain type="kvm">
Dec 02 23:54:48 compute-1 nova_compute[187157]:   <uuid>2e1c5d01-3310-41d8-8a6d-780b09f6bf06</uuid>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   <name>instance-00000005</name>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   <memory>131072</memory>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   <vcpu>1</vcpu>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   <metadata>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1874250387</nova:name>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-02 23:54:48</nova:creationTime>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 02 23:54:48 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 02 23:54:48 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 02 23:54:48 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 02 23:54:48 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 23:54:48 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 02 23:54:48 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 02 23:54:48 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 02 23:54:48 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 02 23:54:48 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 02 23:54:48 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 02 23:54:48 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 02 23:54:48 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 02 23:54:48 compute-1 nova_compute[187157]:         <nova:properties>
Dec 02 23:54:48 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 02 23:54:48 compute-1 nova_compute[187157]:         </nova:properties>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       </nova:image>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <nova:owner>
Dec 02 23:54:48 compute-1 nova_compute[187157]:         <nova:user uuid="d31b8a74cb3c48f3b147970eec936bca">tempest-TestExecuteActionsViaActuator-1889160444-project-admin</nova:user>
Dec 02 23:54:48 compute-1 nova_compute[187157]:         <nova:project uuid="5f2368878ee9447ea8fcef9927711e2d">tempest-TestExecuteActionsViaActuator-1889160444</nova:project>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       </nova:owner>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <nova:ports>
Dec 02 23:54:48 compute-1 nova_compute[187157]:         <nova:port uuid="71da42a2-d97f-47f6-999c-93e4ef78e6e2">
Dec 02 23:54:48 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:         </nova:port>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       </nova:ports>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     </nova:instance>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   </metadata>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <system>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <entry name="serial">2e1c5d01-3310-41d8-8a6d-780b09f6bf06</entry>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <entry name="uuid">2e1c5d01-3310-41d8-8a6d-780b09f6bf06</entry>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     </system>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   </sysinfo>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   <os>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   </os>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   <features>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <acpi/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <apic/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <vmcoreinfo/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   </features>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   </clock>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact">
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <model>Nehalem</model>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <topology sockets="1" cores="1" threads="1"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   </cpu>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   <devices>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     </disk>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk.config"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     </disk>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:e2:d6:d0"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <target dev="tap71da42a2-d9"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     </interface>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/console.log" append="off"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     </serial>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <video>
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     </video>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     </rng>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <controller type="usb" index="0"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 02 23:54:48 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 02 23:54:48 compute-1 nova_compute[187157]:     </memballoon>
Dec 02 23:54:48 compute-1 nova_compute[187157]:   </devices>
Dec 02 23:54:48 compute-1 nova_compute[187157]: </domain>
Dec 02 23:54:48 compute-1 nova_compute[187157]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.985 187161 DEBUG nova.compute.manager [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Preparing to wait for external event network-vif-plugged-71da42a2-d97f-47f6-999c-93e4ef78e6e2 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.985 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.985 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.986 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.986 187161 DEBUG nova.virt.libvirt.vif [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-02T23:54:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1874250387',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1874250387',id=5,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-zhbwq71l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:54:43Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=2e1c5d01-3310-41d8-8a6d-780b09f6bf06,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "address": "fa:16:3e:e2:d6:d0", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71da42a2-d9", "ovs_interfaceid": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.987 187161 DEBUG nova.network.os_vif_util [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "address": "fa:16:3e:e2:d6:d0", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71da42a2-d9", "ovs_interfaceid": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.987 187161 DEBUG nova.network.os_vif_util [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:d6:d0,bridge_name='br-int',has_traffic_filtering=True,id=71da42a2-d97f-47f6-999c-93e4ef78e6e2,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71da42a2-d9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.988 187161 DEBUG os_vif [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:d6:d0,bridge_name='br-int',has_traffic_filtering=True,id=71da42a2-d97f-47f6-999c-93e4ef78e6e2,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71da42a2-d9') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.988 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.989 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.989 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.989 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.990 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '8290ca1b-c9fa-5803-8279-5fd9bf71cf78', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.991 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.993 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.997 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.998 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap71da42a2-d9, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:54:48 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.999 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap71da42a2-d9, col_values=(('qos', UUID('da881451-716c-4c52-b838-a3641b35a33a')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:54:49 compute-1 nova_compute[187157]: 2025-12-02 23:54:48.999 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap71da42a2-d9, col_values=(('external_ids', {'iface-id': '71da42a2-d97f-47f6-999c-93e4ef78e6e2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:d6:d0', 'vm-uuid': '2e1c5d01-3310-41d8-8a6d-780b09f6bf06'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:54:49 compute-1 NetworkManager[55553]: <info>  [1764719689.0028] manager: (tap71da42a2-d9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Dec 02 23:54:49 compute-1 nova_compute[187157]: 2025-12-02 23:54:49.001 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:49 compute-1 nova_compute[187157]: 2025-12-02 23:54:49.006 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:54:49 compute-1 nova_compute[187157]: 2025-12-02 23:54:49.010 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:49 compute-1 nova_compute[187157]: 2025-12-02 23:54:49.011 187161 INFO os_vif [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:d6:d0,bridge_name='br-int',has_traffic_filtering=True,id=71da42a2-d97f-47f6-999c-93e4ef78e6e2,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71da42a2-d9')
Dec 02 23:54:49 compute-1 openstack_network_exporter[199685]: ERROR   23:54:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:54:49 compute-1 openstack_network_exporter[199685]: ERROR   23:54:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:54:49 compute-1 openstack_network_exporter[199685]: ERROR   23:54:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:54:49 compute-1 openstack_network_exporter[199685]: ERROR   23:54:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:54:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:54:49 compute-1 openstack_network_exporter[199685]: ERROR   23:54:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:54:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:54:50 compute-1 nova_compute[187157]: 2025-12-02 23:54:50.768 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:54:50 compute-1 nova_compute[187157]: 2025-12-02 23:54:50.769 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:54:50 compute-1 nova_compute[187157]: 2025-12-02 23:54:50.769 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No VIF found with MAC fa:16:3e:e2:d6:d0, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 02 23:54:50 compute-1 nova_compute[187157]: 2025-12-02 23:54:50.770 187161 INFO nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Using config drive
Dec 02 23:54:51 compute-1 nova_compute[187157]: 2025-12-02 23:54:51.586 187161 WARNING neutronclient.v2_0.client [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:54:51 compute-1 nova_compute[187157]: 2025-12-02 23:54:51.668 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:52 compute-1 nova_compute[187157]: 2025-12-02 23:54:52.153 187161 INFO nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Creating config drive at /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk.config
Dec 02 23:54:52 compute-1 nova_compute[187157]: 2025-12-02 23:54:52.167 187161 DEBUG oslo_concurrency.processutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpzluugrcp execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:54:52 compute-1 nova_compute[187157]: 2025-12-02 23:54:52.310 187161 DEBUG oslo_concurrency.processutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpzluugrcp" returned: 0 in 0.144s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:54:52 compute-1 kernel: tap71da42a2-d9: entered promiscuous mode
Dec 02 23:54:52 compute-1 ovn_controller[95464]: 2025-12-02T23:54:52Z|00048|binding|INFO|Claiming lport 71da42a2-d97f-47f6-999c-93e4ef78e6e2 for this chassis.
Dec 02 23:54:52 compute-1 ovn_controller[95464]: 2025-12-02T23:54:52Z|00049|binding|INFO|71da42a2-d97f-47f6-999c-93e4ef78e6e2: Claiming fa:16:3e:e2:d6:d0 10.100.0.7
Dec 02 23:54:52 compute-1 NetworkManager[55553]: <info>  [1764719692.4117] manager: (tap71da42a2-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Dec 02 23:54:52 compute-1 nova_compute[187157]: 2025-12-02 23:54:52.411 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:52 compute-1 nova_compute[187157]: 2025-12-02 23:54:52.416 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:52 compute-1 systemd-udevd[209978]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:54:52 compute-1 podman[209949]: 2025-12-02 23:54:52.458266176 +0000 UTC m=+0.091714747 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 23:54:52 compute-1 systemd-machined[153454]: New machine qemu-2-instance-00000005.
Dec 02 23:54:52 compute-1 NetworkManager[55553]: <info>  [1764719692.4666] device (tap71da42a2-d9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:54:52 compute-1 NetworkManager[55553]: <info>  [1764719692.4673] device (tap71da42a2-d9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 02 23:54:52 compute-1 nova_compute[187157]: 2025-12-02 23:54:52.471 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:52 compute-1 systemd[1]: Started Virtual Machine qemu-2-instance-00000005.
Dec 02 23:54:52 compute-1 ovn_controller[95464]: 2025-12-02T23:54:52Z|00050|binding|INFO|Setting lport 71da42a2-d97f-47f6-999c-93e4ef78e6e2 ovn-installed in OVS
Dec 02 23:54:52 compute-1 nova_compute[187157]: 2025-12-02 23:54:52.477 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:52 compute-1 ovn_controller[95464]: 2025-12-02T23:54:52Z|00051|binding|INFO|Setting lport 71da42a2-d97f-47f6-999c-93e4ef78e6e2 up in Southbound
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.553 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:d6:d0 10.100.0.7'], port_security=['fa:16:3e:e2:d6:d0 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2e1c5d01-3310-41d8-8a6d-780b09f6bf06', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=71da42a2-d97f-47f6-999c-93e4ef78e6e2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.554 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 71da42a2-d97f-47f6-999c-93e4ef78e6e2 in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a bound to our chassis
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.556 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.573 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e6132431-9be6-491c-bf9e-14ecbc9feccc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.574 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapec494140-a1 in ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.579 207957 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapec494140-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.579 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e12d99eb-2349-47e4-91e6-4c32cef7b484]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.580 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b5db2303-3725-4916-9e7d-5c1d8450d118]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.599 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[4752c872-159e-4656-ae76-fcd0f20ece7d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.619 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e88774ed-a60b-4004-b3f1-ebbe375cce37]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.649 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[b80b430d-ca52-4550-ae71-7b9af456265b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 NetworkManager[55553]: <info>  [1764719692.6551] manager: (tapec494140-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.654 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[7d9722b9-028e-48c6-a679-a3951531c2c2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.691 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[d94ab906-d92f-470b-a3d3-999625522605]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.694 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[d7eca4e8-4322-49d0-95f1-1463d5ce6d8d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 NetworkManager[55553]: <info>  [1764719692.7183] device (tapec494140-a0): carrier: link connected
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.724 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[44585d91-68f6-4209-9c38-05917da73565]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.742 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e496d2dc-b40f-48b9-b5f0-9aebdbe33ace]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371237, 'reachable_time': 16394, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210022, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.758 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[063293a7-0cec-48a8-b90f-28bf061836de]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feab:f8c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371237, 'tstamp': 371237}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210023, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.776 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b25e14c6-f938-4cdf-959b-34cde78f21b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371237, 'reachable_time': 16394, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210024, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.809 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[0b513179-3975-4d63-9076-6c00908c9e69]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.880 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[c8c96dd7-63e3-4218-bb3e-4220810657d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.881 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.881 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.881 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec494140-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:54:52 compute-1 nova_compute[187157]: 2025-12-02 23:54:52.883 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:52 compute-1 NetworkManager[55553]: <info>  [1764719692.8841] manager: (tapec494140-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Dec 02 23:54:52 compute-1 kernel: tapec494140-a0: entered promiscuous mode
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.886 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec494140-a0, col_values=(('external_ids', {'iface-id': '9ee451cb-cc6e-44d6-98fb-cdfa0566e521'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:54:52 compute-1 nova_compute[187157]: 2025-12-02 23:54:52.886 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:52 compute-1 ovn_controller[95464]: 2025-12-02T23:54:52Z|00052|binding|INFO|Releasing lport 9ee451cb-cc6e-44d6-98fb-cdfa0566e521 from this chassis (sb_readonly=0)
Dec 02 23:54:52 compute-1 nova_compute[187157]: 2025-12-02 23:54:52.897 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.899 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[308fc873-232f-41e1-bcce-8acb360b3b40]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.900 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.900 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.900 104348 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for ec494140-a5f4-4327-8807-d7248b1cdc9a disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.900 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.901 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[0bd5ab69-59ed-431d-8931-6bc38aca68d6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.901 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.901 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae716da-02c8-44c8-b501-340e2f9b3187]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.901 104348 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: global
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     log         /dev/log local0 debug
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     log-tag     haproxy-metadata-proxy-ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     user        root
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     group       root
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     maxconn     1024
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     pidfile     /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     daemon
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: defaults
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     log global
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     mode http
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     option httplog
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     option dontlognull
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     option http-server-close
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     option forwardfor
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     retries                 3
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     timeout http-request    30s
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     timeout connect         30s
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     timeout client          32s
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     timeout server          32s
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     timeout http-keep-alive 30s
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: listen listener
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     bind 169.254.169.254:80
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     server metadata /var/lib/neutron/metadata_proxy
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:     http-request add-header X-OVN-Network-ID ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 02 23:54:52 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:54:52.902 104348 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'env', 'PROCESS_TAG=haproxy-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ec494140-a5f4-4327-8807-d7248b1cdc9a.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 02 23:54:53 compute-1 nova_compute[187157]: 2025-12-02 23:54:53.102 187161 DEBUG nova.compute.manager [req-5f86d1a1-eecb-4449-ba8e-a7125eb24d94 req-80e85f84-1e78-4e6f-bc6e-dbce9cc3aea0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Received event network-vif-plugged-71da42a2-d97f-47f6-999c-93e4ef78e6e2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:54:53 compute-1 nova_compute[187157]: 2025-12-02 23:54:53.103 187161 DEBUG oslo_concurrency.lockutils [req-5f86d1a1-eecb-4449-ba8e-a7125eb24d94 req-80e85f84-1e78-4e6f-bc6e-dbce9cc3aea0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:53 compute-1 nova_compute[187157]: 2025-12-02 23:54:53.103 187161 DEBUG oslo_concurrency.lockutils [req-5f86d1a1-eecb-4449-ba8e-a7125eb24d94 req-80e85f84-1e78-4e6f-bc6e-dbce9cc3aea0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:53 compute-1 nova_compute[187157]: 2025-12-02 23:54:53.104 187161 DEBUG oslo_concurrency.lockutils [req-5f86d1a1-eecb-4449-ba8e-a7125eb24d94 req-80e85f84-1e78-4e6f-bc6e-dbce9cc3aea0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:53 compute-1 nova_compute[187157]: 2025-12-02 23:54:53.104 187161 DEBUG nova.compute.manager [req-5f86d1a1-eecb-4449-ba8e-a7125eb24d94 req-80e85f84-1e78-4e6f-bc6e-dbce9cc3aea0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Processing event network-vif-plugged-71da42a2-d97f-47f6-999c-93e4ef78e6e2 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 02 23:54:53 compute-1 nova_compute[187157]: 2025-12-02 23:54:53.255 187161 DEBUG nova.compute.manager [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 02 23:54:53 compute-1 nova_compute[187157]: 2025-12-02 23:54:53.259 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 02 23:54:53 compute-1 nova_compute[187157]: 2025-12-02 23:54:53.263 187161 INFO nova.virt.libvirt.driver [-] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Instance spawned successfully.
Dec 02 23:54:53 compute-1 nova_compute[187157]: 2025-12-02 23:54:53.264 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 02 23:54:53 compute-1 podman[210063]: 2025-12-02 23:54:53.301361592 +0000 UTC m=+0.057234197 container create 564cd0ce96fd007a32a75eff16d4791ad98cf04aefa529264578e7ced7096856 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 02 23:54:53 compute-1 systemd[1]: Started libpod-conmon-564cd0ce96fd007a32a75eff16d4791ad98cf04aefa529264578e7ced7096856.scope.
Dec 02 23:54:53 compute-1 systemd[1]: Started libcrun container.
Dec 02 23:54:53 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b6af4d22daec3c1512af11c35f2d6e36f051643211d608a6c556ba1f66c6ffd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 23:54:53 compute-1 podman[210063]: 2025-12-02 23:54:53.278695317 +0000 UTC m=+0.034567942 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 02 23:54:53 compute-1 podman[210063]: 2025-12-02 23:54:53.387844092 +0000 UTC m=+0.143716707 container init 564cd0ce96fd007a32a75eff16d4791ad98cf04aefa529264578e7ced7096856 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:54:53 compute-1 podman[210063]: 2025-12-02 23:54:53.395000814 +0000 UTC m=+0.150873409 container start 564cd0ce96fd007a32a75eff16d4791ad98cf04aefa529264578e7ced7096856 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Dec 02 23:54:53 compute-1 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[210078]: [NOTICE]   (210082) : New worker (210084) forked
Dec 02 23:54:53 compute-1 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[210078]: [NOTICE]   (210082) : Loading success.
Dec 02 23:54:53 compute-1 nova_compute[187157]: 2025-12-02 23:54:53.780 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:54:53 compute-1 nova_compute[187157]: 2025-12-02 23:54:53.781 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:54:53 compute-1 nova_compute[187157]: 2025-12-02 23:54:53.782 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:54:53 compute-1 nova_compute[187157]: 2025-12-02 23:54:53.783 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:54:53 compute-1 nova_compute[187157]: 2025-12-02 23:54:53.784 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:54:53 compute-1 nova_compute[187157]: 2025-12-02 23:54:53.785 187161 DEBUG nova.virt.libvirt.driver [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:54:54 compute-1 nova_compute[187157]: 2025-12-02 23:54:54.003 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:54 compute-1 nova_compute[187157]: 2025-12-02 23:54:54.608 187161 INFO nova.compute.manager [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Took 10.52 seconds to spawn the instance on the hypervisor.
Dec 02 23:54:54 compute-1 nova_compute[187157]: 2025-12-02 23:54:54.609 187161 DEBUG nova.compute.manager [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 02 23:54:55 compute-1 nova_compute[187157]: 2025-12-02 23:54:55.152 187161 INFO nova.compute.manager [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Took 15.82 seconds to build instance.
Dec 02 23:54:55 compute-1 nova_compute[187157]: 2025-12-02 23:54:55.392 187161 DEBUG nova.compute.manager [req-992edb0f-5d09-4b6b-9c11-fb20433a5dec req-06dc4ec6-f13e-4fe1-aec0-9639d2abe3ba 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Received event network-vif-plugged-71da42a2-d97f-47f6-999c-93e4ef78e6e2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:54:55 compute-1 nova_compute[187157]: 2025-12-02 23:54:55.393 187161 DEBUG oslo_concurrency.lockutils [req-992edb0f-5d09-4b6b-9c11-fb20433a5dec req-06dc4ec6-f13e-4fe1-aec0-9639d2abe3ba 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:55 compute-1 nova_compute[187157]: 2025-12-02 23:54:55.393 187161 DEBUG oslo_concurrency.lockutils [req-992edb0f-5d09-4b6b-9c11-fb20433a5dec req-06dc4ec6-f13e-4fe1-aec0-9639d2abe3ba 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:55 compute-1 nova_compute[187157]: 2025-12-02 23:54:55.393 187161 DEBUG oslo_concurrency.lockutils [req-992edb0f-5d09-4b6b-9c11-fb20433a5dec req-06dc4ec6-f13e-4fe1-aec0-9639d2abe3ba 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:55 compute-1 nova_compute[187157]: 2025-12-02 23:54:55.393 187161 DEBUG nova.compute.manager [req-992edb0f-5d09-4b6b-9c11-fb20433a5dec req-06dc4ec6-f13e-4fe1-aec0-9639d2abe3ba 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] No waiting events found dispatching network-vif-plugged-71da42a2-d97f-47f6-999c-93e4ef78e6e2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:54:55 compute-1 nova_compute[187157]: 2025-12-02 23:54:55.394 187161 WARNING nova.compute.manager [req-992edb0f-5d09-4b6b-9c11-fb20433a5dec req-06dc4ec6-f13e-4fe1-aec0-9639d2abe3ba 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Received unexpected event network-vif-plugged-71da42a2-d97f-47f6-999c-93e4ef78e6e2 for instance with vm_state active and task_state None.
Dec 02 23:54:55 compute-1 nova_compute[187157]: 2025-12-02 23:54:55.658 187161 DEBUG oslo_concurrency.lockutils [None req-be9d9737-e9f9-4c6b-b4ec-b4662cd710b5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.349s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:56 compute-1 podman[210093]: 2025-12-02 23:54:56.314144861 +0000 UTC m=+0.134495946 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 02 23:54:56 compute-1 nova_compute[187157]: 2025-12-02 23:54:56.670 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:59 compute-1 nova_compute[187157]: 2025-12-02 23:54:59.005 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:59 compute-1 podman[210119]: 2025-12-02 23:54:59.254221301 +0000 UTC m=+0.088482909 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 02 23:55:01 compute-1 nova_compute[187157]: 2025-12-02 23:55:01.672 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:01.698 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:01.699 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:01.699 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:04 compute-1 nova_compute[187157]: 2025-12-02 23:55:04.053 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:05 compute-1 podman[197537]: time="2025-12-02T23:55:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:55:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:55:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 02 23:55:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:55:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3053 "" "Go-http-client/1.1"
Dec 02 23:55:06 compute-1 nova_compute[187157]: 2025-12-02 23:55:06.674 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:07 compute-1 ovn_controller[95464]: 2025-12-02T23:55:07Z|00003|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e2:d6:d0 10.100.0.7
Dec 02 23:55:07 compute-1 ovn_controller[95464]: 2025-12-02T23:55:07Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e2:d6:d0 10.100.0.7
Dec 02 23:55:09 compute-1 nova_compute[187157]: 2025-12-02 23:55:09.055 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:09 compute-1 podman[210156]: 2025-12-02 23:55:09.227091575 +0000 UTC m=+0.067562519 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 23:55:11 compute-1 nova_compute[187157]: 2025-12-02 23:55:11.676 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:12 compute-1 nova_compute[187157]: 2025-12-02 23:55:12.553 187161 DEBUG nova.compute.manager [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Stashing vm_state: active _prep_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:6173
Dec 02 23:55:13 compute-1 nova_compute[187157]: 2025-12-02 23:55:13.119 187161 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:13 compute-1 nova_compute[187157]: 2025-12-02 23:55:13.120 187161 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:13 compute-1 nova_compute[187157]: 2025-12-02 23:55:13.642 187161 DEBUG nova.virt.hardware [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 02 23:55:13 compute-1 nova_compute[187157]: 2025-12-02 23:55:13.643 187161 INFO nova.compute.claims [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Claim successful on node compute-1.ctlplane.example.com
Dec 02 23:55:14 compute-1 nova_compute[187157]: 2025-12-02 23:55:14.059 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:14 compute-1 nova_compute[187157]: 2025-12-02 23:55:14.771 187161 INFO nova.compute.resource_tracker [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Updating resource usage from migration 10d3b043-2ad6-4e69-839b-9c9c56bc0f9a
Dec 02 23:55:14 compute-1 nova_compute[187157]: 2025-12-02 23:55:14.771 187161 DEBUG nova.compute.resource_tracker [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Starting to track incoming migration 10d3b043-2ad6-4e69-839b-9c9c56bc0f9a with flavor 5e93ebd2-51fb-4d1d-bbb0-cd8e6a7d6f1d _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 02 23:55:15 compute-1 podman[210177]: 2025-12-02 23:55:15.254143846 +0000 UTC m=+0.082990473 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 02 23:55:15 compute-1 nova_compute[187157]: 2025-12-02 23:55:15.356 187161 DEBUG nova.compute.provider_tree [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:55:15 compute-1 nova_compute[187157]: 2025-12-02 23:55:15.867 187161 DEBUG nova.scheduler.client.report [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:55:16 compute-1 nova_compute[187157]: 2025-12-02 23:55:16.386 187161 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 3.265s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:16 compute-1 nova_compute[187157]: 2025-12-02 23:55:16.386 187161 INFO nova.compute.manager [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Migrating
Dec 02 23:55:16 compute-1 nova_compute[187157]: 2025-12-02 23:55:16.387 187161 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:55:16 compute-1 nova_compute[187157]: 2025-12-02 23:55:16.387 187161 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:55:16 compute-1 nova_compute[187157]: 2025-12-02 23:55:16.678 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:16 compute-1 nova_compute[187157]: 2025-12-02 23:55:16.894 187161 INFO nova.compute.rpcapi [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Automatically selected compute RPC version 6.4 from minimum service version 70
Dec 02 23:55:16 compute-1 nova_compute[187157]: 2025-12-02 23:55:16.895 187161 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:55:19 compute-1 nova_compute[187157]: 2025-12-02 23:55:19.063 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:19 compute-1 openstack_network_exporter[199685]: ERROR   23:55:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:55:19 compute-1 openstack_network_exporter[199685]: ERROR   23:55:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:55:19 compute-1 openstack_network_exporter[199685]: ERROR   23:55:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:55:19 compute-1 openstack_network_exporter[199685]: ERROR   23:55:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:55:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:55:19 compute-1 openstack_network_exporter[199685]: ERROR   23:55:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:55:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:55:21 compute-1 sshd-session[210197]: Accepted publickey for nova from 192.168.122.100 port 53284 ssh2: ECDSA SHA256:3AllEFUYW7uiMxyM2nTMuXWI0wJTJaAim9Lq1c5tGGQ
Dec 02 23:55:21 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Dec 02 23:55:21 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Dec 02 23:55:21 compute-1 systemd-logind[790]: New session 27 of user nova.
Dec 02 23:55:21 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Dec 02 23:55:21 compute-1 systemd[1]: Starting User Manager for UID 42436...
Dec 02 23:55:21 compute-1 systemd[210201]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 02 23:55:21 compute-1 systemd[210201]: Queued start job for default target Main User Target.
Dec 02 23:55:21 compute-1 systemd[210201]: Created slice User Application Slice.
Dec 02 23:55:21 compute-1 systemd[210201]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 02 23:55:21 compute-1 systemd[210201]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 23:55:21 compute-1 systemd[210201]: Reached target Paths.
Dec 02 23:55:21 compute-1 systemd[210201]: Reached target Timers.
Dec 02 23:55:21 compute-1 systemd[210201]: Starting D-Bus User Message Bus Socket...
Dec 02 23:55:21 compute-1 systemd[210201]: Starting Create User's Volatile Files and Directories...
Dec 02 23:55:21 compute-1 systemd[210201]: Listening on D-Bus User Message Bus Socket.
Dec 02 23:55:21 compute-1 systemd[210201]: Finished Create User's Volatile Files and Directories.
Dec 02 23:55:21 compute-1 systemd[210201]: Reached target Sockets.
Dec 02 23:55:21 compute-1 systemd[210201]: Reached target Basic System.
Dec 02 23:55:21 compute-1 systemd[210201]: Reached target Main User Target.
Dec 02 23:55:21 compute-1 systemd[210201]: Startup finished in 161ms.
Dec 02 23:55:21 compute-1 systemd[1]: Started User Manager for UID 42436.
Dec 02 23:55:21 compute-1 systemd[1]: Started Session 27 of User nova.
Dec 02 23:55:21 compute-1 sshd-session[210197]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 02 23:55:21 compute-1 sshd-session[210216]: Received disconnect from 192.168.122.100 port 53284:11: disconnected by user
Dec 02 23:55:21 compute-1 sshd-session[210216]: Disconnected from user nova 192.168.122.100 port 53284
Dec 02 23:55:21 compute-1 sshd-session[210197]: pam_unix(sshd:session): session closed for user nova
Dec 02 23:55:21 compute-1 systemd[1]: session-27.scope: Deactivated successfully.
Dec 02 23:55:21 compute-1 systemd-logind[790]: Session 27 logged out. Waiting for processes to exit.
Dec 02 23:55:21 compute-1 systemd-logind[790]: Removed session 27.
Dec 02 23:55:21 compute-1 sshd-session[210218]: Accepted publickey for nova from 192.168.122.100 port 53288 ssh2: ECDSA SHA256:3AllEFUYW7uiMxyM2nTMuXWI0wJTJaAim9Lq1c5tGGQ
Dec 02 23:55:21 compute-1 systemd-logind[790]: New session 29 of user nova.
Dec 02 23:55:21 compute-1 systemd[1]: Started Session 29 of User nova.
Dec 02 23:55:21 compute-1 sshd-session[210218]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 02 23:55:21 compute-1 sshd-session[210221]: Received disconnect from 192.168.122.100 port 53288:11: disconnected by user
Dec 02 23:55:21 compute-1 sshd-session[210221]: Disconnected from user nova 192.168.122.100 port 53288
Dec 02 23:55:21 compute-1 sshd-session[210218]: pam_unix(sshd:session): session closed for user nova
Dec 02 23:55:21 compute-1 systemd[1]: session-29.scope: Deactivated successfully.
Dec 02 23:55:21 compute-1 systemd-logind[790]: Session 29 logged out. Waiting for processes to exit.
Dec 02 23:55:21 compute-1 systemd-logind[790]: Removed session 29.
Dec 02 23:55:21 compute-1 nova_compute[187157]: 2025-12-02 23:55:21.743 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:22 compute-1 ovn_controller[95464]: 2025-12-02T23:55:22Z|00053|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Dec 02 23:55:23 compute-1 podman[210223]: 2025-12-02 23:55:23.236977233 +0000 UTC m=+0.073167622 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:55:24 compute-1 nova_compute[187157]: 2025-12-02 23:55:24.065 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:24 compute-1 nova_compute[187157]: 2025-12-02 23:55:24.229 187161 DEBUG nova.compute.manager [req-30c9581f-2610-4d8c-ac31-9e128cd495a2 req-2c1f0727-26e5-49a9-8078-f1a02b42d422 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:55:24 compute-1 nova_compute[187157]: 2025-12-02 23:55:24.230 187161 DEBUG oslo_concurrency.lockutils [req-30c9581f-2610-4d8c-ac31-9e128cd495a2 req-2c1f0727-26e5-49a9-8078-f1a02b42d422 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:24 compute-1 nova_compute[187157]: 2025-12-02 23:55:24.230 187161 DEBUG oslo_concurrency.lockutils [req-30c9581f-2610-4d8c-ac31-9e128cd495a2 req-2c1f0727-26e5-49a9-8078-f1a02b42d422 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:24 compute-1 nova_compute[187157]: 2025-12-02 23:55:24.231 187161 DEBUG oslo_concurrency.lockutils [req-30c9581f-2610-4d8c-ac31-9e128cd495a2 req-2c1f0727-26e5-49a9-8078-f1a02b42d422 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:24 compute-1 nova_compute[187157]: 2025-12-02 23:55:24.231 187161 DEBUG nova.compute.manager [req-30c9581f-2610-4d8c-ac31-9e128cd495a2 req-2c1f0727-26e5-49a9-8078-f1a02b42d422 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] No waiting events found dispatching network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:55:24 compute-1 nova_compute[187157]: 2025-12-02 23:55:24.231 187161 WARNING nova.compute.manager [req-30c9581f-2610-4d8c-ac31-9e128cd495a2 req-2c1f0727-26e5-49a9-8078-f1a02b42d422 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received unexpected event network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for instance with vm_state active and task_state resize_migrating.
Dec 02 23:55:24 compute-1 nova_compute[187157]: 2025-12-02 23:55:24.288 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:24 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:24.289 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:55:24 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:24.291 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:55:24 compute-1 sshd-session[210248]: Accepted publickey for nova from 192.168.122.100 port 54278 ssh2: ECDSA SHA256:3AllEFUYW7uiMxyM2nTMuXWI0wJTJaAim9Lq1c5tGGQ
Dec 02 23:55:24 compute-1 systemd-logind[790]: New session 30 of user nova.
Dec 02 23:55:24 compute-1 systemd[1]: Started Session 30 of User nova.
Dec 02 23:55:24 compute-1 sshd-session[210248]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 02 23:55:25 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:25.294 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:25 compute-1 sshd-session[210251]: Received disconnect from 192.168.122.100 port 54278:11: disconnected by user
Dec 02 23:55:25 compute-1 sshd-session[210251]: Disconnected from user nova 192.168.122.100 port 54278
Dec 02 23:55:25 compute-1 sshd-session[210248]: pam_unix(sshd:session): session closed for user nova
Dec 02 23:55:25 compute-1 systemd[1]: session-30.scope: Deactivated successfully.
Dec 02 23:55:25 compute-1 systemd-logind[790]: Session 30 logged out. Waiting for processes to exit.
Dec 02 23:55:25 compute-1 systemd-logind[790]: Removed session 30.
Dec 02 23:55:25 compute-1 sshd-session[210253]: Accepted publickey for nova from 192.168.122.100 port 54284 ssh2: ECDSA SHA256:3AllEFUYW7uiMxyM2nTMuXWI0wJTJaAim9Lq1c5tGGQ
Dec 02 23:55:25 compute-1 systemd-logind[790]: New session 31 of user nova.
Dec 02 23:55:25 compute-1 systemd[1]: Started Session 31 of User nova.
Dec 02 23:55:25 compute-1 sshd-session[210253]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 02 23:55:25 compute-1 sshd-session[210256]: Received disconnect from 192.168.122.100 port 54284:11: disconnected by user
Dec 02 23:55:25 compute-1 sshd-session[210256]: Disconnected from user nova 192.168.122.100 port 54284
Dec 02 23:55:25 compute-1 sshd-session[210253]: pam_unix(sshd:session): session closed for user nova
Dec 02 23:55:25 compute-1 systemd[1]: session-31.scope: Deactivated successfully.
Dec 02 23:55:25 compute-1 systemd-logind[790]: Session 31 logged out. Waiting for processes to exit.
Dec 02 23:55:25 compute-1 systemd-logind[790]: Removed session 31.
Dec 02 23:55:25 compute-1 sshd-session[210258]: Accepted publickey for nova from 192.168.122.100 port 54286 ssh2: ECDSA SHA256:3AllEFUYW7uiMxyM2nTMuXWI0wJTJaAim9Lq1c5tGGQ
Dec 02 23:55:25 compute-1 systemd-logind[790]: New session 32 of user nova.
Dec 02 23:55:25 compute-1 systemd[1]: Started Session 32 of User nova.
Dec 02 23:55:25 compute-1 sshd-session[210258]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 02 23:55:25 compute-1 sshd-session[210261]: Received disconnect from 192.168.122.100 port 54286:11: disconnected by user
Dec 02 23:55:25 compute-1 sshd-session[210261]: Disconnected from user nova 192.168.122.100 port 54286
Dec 02 23:55:25 compute-1 sshd-session[210258]: pam_unix(sshd:session): session closed for user nova
Dec 02 23:55:25 compute-1 systemd[1]: session-32.scope: Deactivated successfully.
Dec 02 23:55:25 compute-1 systemd-logind[790]: Session 32 logged out. Waiting for processes to exit.
Dec 02 23:55:25 compute-1 systemd-logind[790]: Removed session 32.
Dec 02 23:55:26 compute-1 nova_compute[187157]: 2025-12-02 23:55:26.362 187161 DEBUG nova.compute.manager [req-11745788-7c4c-41ec-9287-aac17a28a941 req-48a4f336-d486-4171-80ae-c9f53367f292 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:55:26 compute-1 nova_compute[187157]: 2025-12-02 23:55:26.362 187161 DEBUG oslo_concurrency.lockutils [req-11745788-7c4c-41ec-9287-aac17a28a941 req-48a4f336-d486-4171-80ae-c9f53367f292 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:26 compute-1 nova_compute[187157]: 2025-12-02 23:55:26.362 187161 DEBUG oslo_concurrency.lockutils [req-11745788-7c4c-41ec-9287-aac17a28a941 req-48a4f336-d486-4171-80ae-c9f53367f292 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:26 compute-1 nova_compute[187157]: 2025-12-02 23:55:26.362 187161 DEBUG oslo_concurrency.lockutils [req-11745788-7c4c-41ec-9287-aac17a28a941 req-48a4f336-d486-4171-80ae-c9f53367f292 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:26 compute-1 nova_compute[187157]: 2025-12-02 23:55:26.363 187161 DEBUG nova.compute.manager [req-11745788-7c4c-41ec-9287-aac17a28a941 req-48a4f336-d486-4171-80ae-c9f53367f292 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] No waiting events found dispatching network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:55:26 compute-1 nova_compute[187157]: 2025-12-02 23:55:26.363 187161 WARNING nova.compute.manager [req-11745788-7c4c-41ec-9287-aac17a28a941 req-48a4f336-d486-4171-80ae-c9f53367f292 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received unexpected event network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for instance with vm_state active and task_state resize_migrating.
Dec 02 23:55:26 compute-1 nova_compute[187157]: 2025-12-02 23:55:26.745 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:27 compute-1 podman[210263]: 2025-12-02 23:55:27.275494264 +0000 UTC m=+0.111078058 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:55:27 compute-1 nova_compute[187157]: 2025-12-02 23:55:27.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:55:28 compute-1 nova_compute[187157]: 2025-12-02 23:55:28.756 187161 WARNING neutronclient.v2_0.client [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:29 compute-1 nova_compute[187157]: 2025-12-02 23:55:29.068 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:29 compute-1 nova_compute[187157]: 2025-12-02 23:55:29.133 187161 INFO nova.network.neutron [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Updating port fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Dec 02 23:55:29 compute-1 nova_compute[187157]: 2025-12-02 23:55:29.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:55:29 compute-1 nova_compute[187157]: 2025-12-02 23:55:29.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:55:30 compute-1 nova_compute[187157]: 2025-12-02 23:55:30.210 187161 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:55:30 compute-1 nova_compute[187157]: 2025-12-02 23:55:30.211 187161 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:55:30 compute-1 nova_compute[187157]: 2025-12-02 23:55:30.211 187161 DEBUG nova.network.neutron [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:55:30 compute-1 podman[210289]: 2025-12-02 23:55:30.256876079 +0000 UTC m=+0.084127330 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Dec 02 23:55:30 compute-1 nova_compute[187157]: 2025-12-02 23:55:30.290 187161 DEBUG nova.compute.manager [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-changed-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:55:30 compute-1 nova_compute[187157]: 2025-12-02 23:55:30.291 187161 DEBUG nova.compute.manager [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Refreshing instance network info cache due to event network-changed-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 02 23:55:30 compute-1 nova_compute[187157]: 2025-12-02 23:55:30.291 187161 DEBUG oslo_concurrency.lockutils [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:55:30 compute-1 nova_compute[187157]: 2025-12-02 23:55:30.717 187161 WARNING neutronclient.v2_0.client [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:31 compute-1 nova_compute[187157]: 2025-12-02 23:55:31.337 187161 WARNING neutronclient.v2_0.client [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:31 compute-1 nova_compute[187157]: 2025-12-02 23:55:31.529 187161 DEBUG nova.network.neutron [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Updating instance_info_cache with network_info: [{"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:55:31 compute-1 nova_compute[187157]: 2025-12-02 23:55:31.696 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:55:31 compute-1 nova_compute[187157]: 2025-12-02 23:55:31.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:55:31 compute-1 nova_compute[187157]: 2025-12-02 23:55:31.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:55:31 compute-1 nova_compute[187157]: 2025-12-02 23:55:31.747 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:32 compute-1 nova_compute[187157]: 2025-12-02 23:55:32.586 187161 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:55:32 compute-1 nova_compute[187157]: 2025-12-02 23:55:32.592 187161 DEBUG oslo_concurrency.lockutils [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:55:32 compute-1 nova_compute[187157]: 2025-12-02 23:55:32.593 187161 DEBUG nova.network.neutron [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Refreshing network info cache for port fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 02 23:55:32 compute-1 nova_compute[187157]: 2025-12-02 23:55:32.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.108 187161 WARNING neutronclient.v2_0.client [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.233 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.233 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.233 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.234 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.276 187161 DEBUG nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Starting finish_migration finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12604
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.278 187161 DEBUG nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Instance directory exists: not creating _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5134
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.278 187161 INFO nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Creating image(s)
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.280 187161 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.381 187161 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.382 187161 DEBUG nova.virt.disk.api [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.382 187161 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.456 187161 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.456 187161 DEBUG nova.virt.disk.api [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.962 187161 DEBUG nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Did not create local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5272
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.963 187161 DEBUG nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Ensure instance console log exists: /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.964 187161 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.964 187161 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.964 187161 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.967 187161 DEBUG nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Start _get_guest_xml network_info=[{"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "vif_mac": "fa:16:3e:f8:84:51"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.972 187161 WARNING nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.974 187161 DEBUG nova.virt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-607610768', uuid='d8ccd45c-e570-4b75-b836-a93e2de1818b'), owner=OwnerMeta(userid='d31b8a74cb3c48f3b147970eec936bca', username='tempest-TestExecuteActionsViaActuator-1889160444-project-admin', projectid='5f2368878ee9447ea8fcef9927711e2d', projectname='tempest-TestExecuteActionsViaActuator-1889160444'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'sata', 'hw_disk_bus': 'virtio', 'hw_input_bus': 'usb', 'hw_machine_type': 'q35', 'hw_pointer_model': 'usbtablet', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.micro', flavorid='5e93ebd2-51fb-4d1d-bbb0-cd8e6a7d6f1d', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "vif_mac": "fa:16:3e:f8:84:51"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764719733.9747405) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.979 187161 DEBUG nova.virt.libvirt.host [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.980 187161 DEBUG nova.virt.libvirt.host [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.983 187161 DEBUG nova.virt.libvirt.host [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.984 187161 DEBUG nova.virt.libvirt.host [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.986 187161 DEBUG nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.986 187161 DEBUG nova.virt.hardware [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='5e93ebd2-51fb-4d1d-bbb0-cd8e6a7d6f1d',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.986 187161 DEBUG nova.virt.hardware [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.987 187161 DEBUG nova.virt.hardware [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.987 187161 DEBUG nova.virt.hardware [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.987 187161 DEBUG nova.virt.hardware [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.987 187161 DEBUG nova.virt.hardware [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.988 187161 DEBUG nova.virt.hardware [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.988 187161 DEBUG nova.virt.hardware [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.988 187161 DEBUG nova.virt.hardware [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.988 187161 DEBUG nova.virt.hardware [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.989 187161 DEBUG nova.virt.hardware [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 02 23:55:33 compute-1 nova_compute[187157]: 2025-12-02 23:55:33.993 187161 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.config --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.066 187161 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.config --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.067 187161 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.067 187161 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.069 187161 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.070 187161 DEBUG nova.virt.libvirt.vif [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-02T23:54:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-607610768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-607610768',id=4,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:54:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-ziravjgf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:55:26Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=d8ccd45c-e570-4b75-b836-a93e2de1818b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "vif_mac": "fa:16:3e:f8:84:51"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.070 187161 DEBUG nova.network.os_vif_util [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "vif_mac": "fa:16:3e:f8:84:51"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.071 187161 DEBUG nova.network.os_vif_util [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.074 187161 DEBUG nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] End _get_guest_xml xml=<domain type="kvm">
Dec 02 23:55:34 compute-1 nova_compute[187157]:   <uuid>d8ccd45c-e570-4b75-b836-a93e2de1818b</uuid>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   <name>instance-00000004</name>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   <memory>196608</memory>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   <vcpu>1</vcpu>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   <metadata>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-607610768</nova:name>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-02 23:55:33</nova:creationTime>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <nova:flavor name="m1.micro" id="5e93ebd2-51fb-4d1d-bbb0-cd8e6a7d6f1d">
Dec 02 23:55:34 compute-1 nova_compute[187157]:         <nova:memory>192</nova:memory>
Dec 02 23:55:34 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 02 23:55:34 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 02 23:55:34 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 23:55:34 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 02 23:55:34 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 02 23:55:34 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 02 23:55:34 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 02 23:55:34 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 02 23:55:34 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 02 23:55:34 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 02 23:55:34 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 02 23:55:34 compute-1 nova_compute[187157]:         <nova:properties>
Dec 02 23:55:34 compute-1 nova_compute[187157]:           <nova:property name="hw_cdrom_bus">sata</nova:property>
Dec 02 23:55:34 compute-1 nova_compute[187157]:           <nova:property name="hw_disk_bus">virtio</nova:property>
Dec 02 23:55:34 compute-1 nova_compute[187157]:           <nova:property name="hw_input_bus">usb</nova:property>
Dec 02 23:55:34 compute-1 nova_compute[187157]:           <nova:property name="hw_machine_type">q35</nova:property>
Dec 02 23:55:34 compute-1 nova_compute[187157]:           <nova:property name="hw_pointer_model">usbtablet</nova:property>
Dec 02 23:55:34 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 02 23:55:34 compute-1 nova_compute[187157]:           <nova:property name="hw_video_model">virtio</nova:property>
Dec 02 23:55:34 compute-1 nova_compute[187157]:           <nova:property name="hw_vif_model">virtio</nova:property>
Dec 02 23:55:34 compute-1 nova_compute[187157]:         </nova:properties>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       </nova:image>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <nova:owner>
Dec 02 23:55:34 compute-1 nova_compute[187157]:         <nova:user uuid="d31b8a74cb3c48f3b147970eec936bca">tempest-TestExecuteActionsViaActuator-1889160444-project-admin</nova:user>
Dec 02 23:55:34 compute-1 nova_compute[187157]:         <nova:project uuid="5f2368878ee9447ea8fcef9927711e2d">tempest-TestExecuteActionsViaActuator-1889160444</nova:project>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       </nova:owner>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <nova:ports>
Dec 02 23:55:34 compute-1 nova_compute[187157]:         <nova:port uuid="fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe">
Dec 02 23:55:34 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:         </nova:port>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       </nova:ports>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     </nova:instance>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   </metadata>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <system>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <entry name="serial">d8ccd45c-e570-4b75-b836-a93e2de1818b</entry>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <entry name="uuid">d8ccd45c-e570-4b75-b836-a93e2de1818b</entry>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     </system>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   </sysinfo>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   <os>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   </os>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   <features>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <acpi/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <apic/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <vmcoreinfo/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   </features>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   </clock>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact">
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <model>Nehalem</model>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <topology sockets="1" cores="1" threads="1"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   </cpu>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   <devices>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     </disk>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.config"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     </disk>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:f8:84:51"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <target dev="tapfbb4ca60-8a"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     </interface>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/console.log" append="off"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     </serial>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <video>
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     </video>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     </rng>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <controller type="usb" index="0"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 02 23:55:34 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 02 23:55:34 compute-1 nova_compute[187157]:     </memballoon>
Dec 02 23:55:34 compute-1 nova_compute[187157]:   </devices>
Dec 02 23:55:34 compute-1 nova_compute[187157]: </domain>
Dec 02 23:55:34 compute-1 nova_compute[187157]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.075 187161 DEBUG nova.virt.libvirt.vif [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-02T23:54:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-607610768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-607610768',id=4,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:54:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-ziravjgf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:55:26Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=d8ccd45c-e570-4b75-b836-a93e2de1818b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "vif_mac": "fa:16:3e:f8:84:51"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.075 187161 DEBUG nova.network.os_vif_util [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "vif_mac": "fa:16:3e:f8:84:51"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.076 187161 DEBUG nova.network.os_vif_util [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.076 187161 DEBUG os_vif [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.077 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.077 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.078 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.079 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.079 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '8fb1aca8-1952-5444-86e5-0e27fed9a58c', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.080 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.084 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.087 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.087 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbb4ca60-8a, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.088 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapfbb4ca60-8a, col_values=(('qos', UUID('8ba5c139-d4ac-4606-b06f-ebf01f9ed250')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.088 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapfbb4ca60-8a, col_values=(('external_ids', {'iface-id': 'fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f8:84:51', 'vm-uuid': 'd8ccd45c-e570-4b75-b836-a93e2de1818b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.089 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:34 compute-1 NetworkManager[55553]: <info>  [1764719734.0908] manager: (tapfbb4ca60-8a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.093 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.097 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.099 187161 INFO os_vif [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a')
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.441 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.518 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.519 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.580 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.671 187161 WARNING neutronclient.v2_0.client [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.757 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.758 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.804 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.805 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5667MB free_disk=73.11301040649414GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.806 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.806 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.834 187161 DEBUG nova.network.neutron [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Updated VIF entry in instance network info cache for port fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 02 23:55:34 compute-1 nova_compute[187157]: 2025-12-02 23:55:34.834 187161 DEBUG nova.network.neutron [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Updating instance_info_cache with network_info: [{"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:55:35 compute-1 nova_compute[187157]: 2025-12-02 23:55:35.458 187161 DEBUG oslo_concurrency.lockutils [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:55:35 compute-1 podman[197537]: time="2025-12-02T23:55:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:55:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:55:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 02 23:55:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:55:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3049 "" "Go-http-client/1.1"
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.010 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Applying migration context for instance d8ccd45c-e570-4b75-b836-a93e2de1818b as it has an incoming, in-progress migration 10d3b043-2ad6-4e69-839b-9c9c56bc0f9a. Migration status is post-migrating _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.011 187161 INFO nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Updating resource usage from migration 10d3b043-2ad6-4e69-839b-9c9c56bc0f9a
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.018 187161 DEBUG nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.019 187161 DEBUG nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.019 187161 DEBUG nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No VIF found with MAC fa:16:3e:f8:84:51, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.020 187161 INFO nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Using config drive
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.042 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 2e1c5d01-3310-41d8-8a6d-780b09f6bf06 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.043 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance d8ccd45c-e570-4b75-b836-a93e2de1818b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.044 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.044 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:55:34 up  1:02,  0 user,  load average: 0.32, 0.22, 0.36\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_resize_finish': '1', 'num_os_type_None': '2', 'num_proj_5f2368878ee9447ea8fcef9927711e2d': '2', 'io_workload': '0', 'num_task_None': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:55:36 compute-1 NetworkManager[55553]: <info>  [1764719736.0861] manager: (tapfbb4ca60-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Dec 02 23:55:36 compute-1 kernel: tapfbb4ca60-8a: entered promiscuous mode
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.088 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:36 compute-1 ovn_controller[95464]: 2025-12-02T23:55:36Z|00054|binding|INFO|Claiming lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for this chassis.
Dec 02 23:55:36 compute-1 ovn_controller[95464]: 2025-12-02T23:55:36Z|00055|binding|INFO|fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe: Claiming fa:16:3e:f8:84:51 10.100.0.10
Dec 02 23:55:36 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Dec 02 23:55:36 compute-1 systemd[210201]: Activating special unit Exit the Session...
Dec 02 23:55:36 compute-1 systemd[210201]: Stopped target Main User Target.
Dec 02 23:55:36 compute-1 systemd[210201]: Stopped target Basic System.
Dec 02 23:55:36 compute-1 systemd[210201]: Stopped target Paths.
Dec 02 23:55:36 compute-1 systemd[210201]: Stopped target Sockets.
Dec 02 23:55:36 compute-1 systemd[210201]: Stopped target Timers.
Dec 02 23:55:36 compute-1 systemd[210201]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 02 23:55:36 compute-1 systemd[210201]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 02 23:55:36 compute-1 systemd[210201]: Closed D-Bus User Message Bus Socket.
Dec 02 23:55:36 compute-1 systemd[210201]: Stopped Create User's Volatile Files and Directories.
Dec 02 23:55:36 compute-1 systemd[210201]: Removed slice User Application Slice.
Dec 02 23:55:36 compute-1 systemd[210201]: Reached target Shutdown.
Dec 02 23:55:36 compute-1 systemd[210201]: Finished Exit the Session.
Dec 02 23:55:36 compute-1 systemd[210201]: Reached target Exit the Session.
Dec 02 23:55:36 compute-1 ovn_controller[95464]: 2025-12-02T23:55:36Z|00056|binding|INFO|Setting lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe ovn-installed in OVS
Dec 02 23:55:36 compute-1 ovn_controller[95464]: 2025-12-02T23:55:36Z|00057|binding|INFO|Setting lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe up in Southbound
Dec 02 23:55:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:36.105 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:84:51 10.100.0.10'], port_security=['fa:16:3e:f8:84:51 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd8ccd45c-e570-4b75-b836-a93e2de1818b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:55:36 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Dec 02 23:55:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:36.106 104348 INFO neutron.agent.ovn.metadata.agent [-] Port fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a bound to our chassis
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.106 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:36 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Dec 02 23:55:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:36.108 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:55:36 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.125 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:55:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:36.126 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d1afb0c7-6451-44be-948f-e49db32b92d4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:55:36 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Dec 02 23:55:36 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Dec 02 23:55:36 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Dec 02 23:55:36 compute-1 systemd-machined[153454]: New machine qemu-3-instance-00000004.
Dec 02 23:55:36 compute-1 systemd-udevd[210347]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:55:36 compute-1 NetworkManager[55553]: <info>  [1764719736.1542] device (tapfbb4ca60-8a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:55:36 compute-1 NetworkManager[55553]: <info>  [1764719736.1550] device (tapfbb4ca60-8a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 02 23:55:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:36.154 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[170e3c5c-b2a0-44fd-b18b-69c3dc55c273]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:55:36 compute-1 systemd[1]: Started Virtual Machine qemu-3-instance-00000004.
Dec 02 23:55:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:36.157 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[7177be8a-8692-42f3-8412-96ba4aad645d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:55:36 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Dec 02 23:55:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:36.186 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e816d9-9757-4524-8968-ee0aa93d445b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:55:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:36.204 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[4e0e902d-b42a-4781-80f7-e57ef49bd450]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371237, 'reachable_time': 16394, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210355, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:55:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:36.220 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[61682623-70c8-4752-9ba1-00a15d9ff5d2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371249, 'tstamp': 371249}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210359, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371252, 'tstamp': 371252}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210359, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:55:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:36.221 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.223 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.225 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:36.225 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec494140-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:36.225 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:55:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:36.226 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec494140-a0, col_values=(('external_ids', {'iface-id': '9ee451cb-cc6e-44d6-98fb-cdfa0566e521'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:36.226 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:55:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:55:36.227 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[fc405616-3700-4729-805e-c3970d0aed05]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ec494140-a5f4-4327-8807-d7248b1cdc9a\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ec494140-a5f4-4327-8807-d7248b1cdc9a\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.442 187161 DEBUG nova.compute.manager [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.446 187161 INFO nova.virt.libvirt.driver [-] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Instance running successfully.
Dec 02 23:55:36 compute-1 virtqemud[186882]: argument unsupported: QEMU guest agent is not configured
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.448 187161 DEBUG nova.virt.libvirt.guest [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:200
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.449 187161 DEBUG nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] finish_migration finished successfully. finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12699
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.461 187161 DEBUG nova.compute.manager [req-50c05863-6007-4921-b70e-92dc208c433a req-c36d786a-e922-4357-ba87-815dc0734c72 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.461 187161 DEBUG oslo_concurrency.lockutils [req-50c05863-6007-4921-b70e-92dc208c433a req-c36d786a-e922-4357-ba87-815dc0734c72 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.461 187161 DEBUG oslo_concurrency.lockutils [req-50c05863-6007-4921-b70e-92dc208c433a req-c36d786a-e922-4357-ba87-815dc0734c72 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.461 187161 DEBUG oslo_concurrency.lockutils [req-50c05863-6007-4921-b70e-92dc208c433a req-c36d786a-e922-4357-ba87-815dc0734c72 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.462 187161 DEBUG nova.compute.manager [req-50c05863-6007-4921-b70e-92dc208c433a req-c36d786a-e922-4357-ba87-815dc0734c72 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] No waiting events found dispatching network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.462 187161 WARNING nova.compute.manager [req-50c05863-6007-4921-b70e-92dc208c433a req-c36d786a-e922-4357-ba87-815dc0734c72 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received unexpected event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for instance with vm_state active and task_state resize_finish.
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.754 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:55:36 compute-1 nova_compute[187157]: 2025-12-02 23:55:36.809 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:37 compute-1 nova_compute[187157]: 2025-12-02 23:55:37.280 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:55:37 compute-1 nova_compute[187157]: 2025-12-02 23:55:37.281 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.475s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:38 compute-1 nova_compute[187157]: 2025-12-02 23:55:38.641 187161 DEBUG nova.compute.manager [req-3102c78d-4338-4d12-81d1-d20c8bbac107 req-1ac0a738-5450-4b24-b01e-973811ff66ce 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:55:38 compute-1 nova_compute[187157]: 2025-12-02 23:55:38.642 187161 DEBUG oslo_concurrency.lockutils [req-3102c78d-4338-4d12-81d1-d20c8bbac107 req-1ac0a738-5450-4b24-b01e-973811ff66ce 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:38 compute-1 nova_compute[187157]: 2025-12-02 23:55:38.643 187161 DEBUG oslo_concurrency.lockutils [req-3102c78d-4338-4d12-81d1-d20c8bbac107 req-1ac0a738-5450-4b24-b01e-973811ff66ce 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:38 compute-1 nova_compute[187157]: 2025-12-02 23:55:38.643 187161 DEBUG oslo_concurrency.lockutils [req-3102c78d-4338-4d12-81d1-d20c8bbac107 req-1ac0a738-5450-4b24-b01e-973811ff66ce 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:38 compute-1 nova_compute[187157]: 2025-12-02 23:55:38.644 187161 DEBUG nova.compute.manager [req-3102c78d-4338-4d12-81d1-d20c8bbac107 req-1ac0a738-5450-4b24-b01e-973811ff66ce 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] No waiting events found dispatching network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:55:38 compute-1 nova_compute[187157]: 2025-12-02 23:55:38.644 187161 WARNING nova.compute.manager [req-3102c78d-4338-4d12-81d1-d20c8bbac107 req-1ac0a738-5450-4b24-b01e-973811ff66ce 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received unexpected event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for instance with vm_state resized and task_state None.
Dec 02 23:55:39 compute-1 nova_compute[187157]: 2025-12-02 23:55:39.092 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:39 compute-1 nova_compute[187157]: 2025-12-02 23:55:39.282 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:55:39 compute-1 nova_compute[187157]: 2025-12-02 23:55:39.283 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:55:40 compute-1 podman[210369]: 2025-12-02 23:55:40.258182641 +0000 UTC m=+0.095792445 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 23:55:41 compute-1 nova_compute[187157]: 2025-12-02 23:55:41.848 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:44 compute-1 nova_compute[187157]: 2025-12-02 23:55:44.103 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:46 compute-1 podman[210388]: 2025-12-02 23:55:46.268199832 +0000 UTC m=+0.088554615 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 02 23:55:46 compute-1 nova_compute[187157]: 2025-12-02 23:55:46.851 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:48 compute-1 ovn_controller[95464]: 2025-12-02T23:55:48Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f8:84:51 10.100.0.10
Dec 02 23:55:49 compute-1 nova_compute[187157]: 2025-12-02 23:55:49.107 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:49 compute-1 openstack_network_exporter[199685]: ERROR   23:55:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:55:49 compute-1 openstack_network_exporter[199685]: ERROR   23:55:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:55:49 compute-1 openstack_network_exporter[199685]: ERROR   23:55:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:55:49 compute-1 openstack_network_exporter[199685]: ERROR   23:55:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:55:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:55:49 compute-1 openstack_network_exporter[199685]: ERROR   23:55:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:55:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:55:51 compute-1 nova_compute[187157]: 2025-12-02 23:55:51.915 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:54 compute-1 nova_compute[187157]: 2025-12-02 23:55:54.118 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:54 compute-1 podman[210420]: 2025-12-02 23:55:54.266054525 +0000 UTC m=+0.097811984 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:55:56 compute-1 nova_compute[187157]: 2025-12-02 23:55:56.918 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:58 compute-1 podman[210446]: 2025-12-02 23:55:58.313342162 +0000 UTC m=+0.143448833 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 02 23:55:59 compute-1 nova_compute[187157]: 2025-12-02 23:55:59.121 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:01 compute-1 podman[210472]: 2025-12-02 23:56:01.260483727 +0000 UTC m=+0.084801717 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:56:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:01.701 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:01.702 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:01.703 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:01 compute-1 nova_compute[187157]: 2025-12-02 23:56:01.921 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:04 compute-1 nova_compute[187157]: 2025-12-02 23:56:04.123 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:05 compute-1 podman[197537]: time="2025-12-02T23:56:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:56:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:56:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 02 23:56:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:56:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3051 "" "Go-http-client/1.1"
Dec 02 23:56:06 compute-1 nova_compute[187157]: 2025-12-02 23:56:06.924 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:09 compute-1 nova_compute[187157]: 2025-12-02 23:56:09.127 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:10 compute-1 nova_compute[187157]: 2025-12-02 23:56:10.137 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "0b759275-94f4-4c19-857f-f04aa6b32c6a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:10 compute-1 nova_compute[187157]: 2025-12-02 23:56:10.138 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "0b759275-94f4-4c19-857f-f04aa6b32c6a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:10 compute-1 nova_compute[187157]: 2025-12-02 23:56:10.659 187161 DEBUG nova.compute.manager [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 02 23:56:11 compute-1 nova_compute[187157]: 2025-12-02 23:56:11.226 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:11 compute-1 nova_compute[187157]: 2025-12-02 23:56:11.227 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:11 compute-1 nova_compute[187157]: 2025-12-02 23:56:11.244 187161 DEBUG nova.virt.hardware [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 02 23:56:11 compute-1 nova_compute[187157]: 2025-12-02 23:56:11.245 187161 INFO nova.compute.claims [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Claim successful on node compute-1.ctlplane.example.com
Dec 02 23:56:11 compute-1 podman[210508]: 2025-12-02 23:56:11.271280403 +0000 UTC m=+0.100572588 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 23:56:11 compute-1 nova_compute[187157]: 2025-12-02 23:56:11.928 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:12 compute-1 nova_compute[187157]: 2025-12-02 23:56:12.350 187161 DEBUG nova.compute.provider_tree [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:56:12 compute-1 nova_compute[187157]: 2025-12-02 23:56:12.872 187161 DEBUG nova.scheduler.client.report [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:56:13 compute-1 nova_compute[187157]: 2025-12-02 23:56:13.422 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.195s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:13 compute-1 nova_compute[187157]: 2025-12-02 23:56:13.423 187161 DEBUG nova.compute.manager [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 02 23:56:14 compute-1 nova_compute[187157]: 2025-12-02 23:56:14.122 187161 DEBUG nova.compute.manager [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 02 23:56:14 compute-1 nova_compute[187157]: 2025-12-02 23:56:14.123 187161 DEBUG nova.network.neutron [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 02 23:56:14 compute-1 nova_compute[187157]: 2025-12-02 23:56:14.124 187161 WARNING neutronclient.v2_0.client [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:56:14 compute-1 nova_compute[187157]: 2025-12-02 23:56:14.125 187161 WARNING neutronclient.v2_0.client [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:56:14 compute-1 nova_compute[187157]: 2025-12-02 23:56:14.130 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:14 compute-1 nova_compute[187157]: 2025-12-02 23:56:14.634 187161 INFO nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 02 23:56:15 compute-1 nova_compute[187157]: 2025-12-02 23:56:15.181 187161 DEBUG nova.compute.manager [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.219 187161 DEBUG nova.network.neutron [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Successfully created port: e54f1a66-edd4-4c1f-ae52-8de4515e4d18 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.336 187161 DEBUG nova.compute.manager [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.338 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.339 187161 INFO nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Creating image(s)
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.340 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "/var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.341 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "/var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.342 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "/var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.343 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.349 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.351 187161 DEBUG oslo_concurrency.processutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.441 187161 DEBUG oslo_concurrency.processutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.443 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.444 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.446 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.453 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.454 187161 DEBUG oslo_concurrency.processutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.528 187161 DEBUG oslo_concurrency.processutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.529 187161 DEBUG oslo_concurrency.processutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.584 187161 DEBUG oslo_concurrency.processutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.585 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.586 187161 DEBUG oslo_concurrency.processutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.661 187161 DEBUG oslo_concurrency.processutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.663 187161 DEBUG nova.virt.disk.api [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Checking if we can resize image /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.664 187161 DEBUG oslo_concurrency.processutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.744 187161 DEBUG oslo_concurrency.processutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.746 187161 DEBUG nova.virt.disk.api [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Cannot resize image /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.747 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.748 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Ensure instance console log exists: /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.748 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.749 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.750 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:16 compute-1 nova_compute[187157]: 2025-12-02 23:56:16.928 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:17 compute-1 nova_compute[187157]: 2025-12-02 23:56:17.166 187161 DEBUG nova.network.neutron [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Successfully updated port: e54f1a66-edd4-4c1f-ae52-8de4515e4d18 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 02 23:56:17 compute-1 podman[210545]: 2025-12-02 23:56:17.232493469 +0000 UTC m=+0.071001990 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Dec 02 23:56:17 compute-1 nova_compute[187157]: 2025-12-02 23:56:17.235 187161 DEBUG nova.compute.manager [req-37b51441-d82f-4b26-a99c-a0b7c8fc67cb req-7df7a17a-15e0-4ff1-b3fc-db0e08402769 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Received event network-changed-e54f1a66-edd4-4c1f-ae52-8de4515e4d18 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:56:17 compute-1 nova_compute[187157]: 2025-12-02 23:56:17.235 187161 DEBUG nova.compute.manager [req-37b51441-d82f-4b26-a99c-a0b7c8fc67cb req-7df7a17a-15e0-4ff1-b3fc-db0e08402769 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Refreshing instance network info cache due to event network-changed-e54f1a66-edd4-4c1f-ae52-8de4515e4d18. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 02 23:56:17 compute-1 nova_compute[187157]: 2025-12-02 23:56:17.236 187161 DEBUG oslo_concurrency.lockutils [req-37b51441-d82f-4b26-a99c-a0b7c8fc67cb req-7df7a17a-15e0-4ff1-b3fc-db0e08402769 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-0b759275-94f4-4c19-857f-f04aa6b32c6a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:56:17 compute-1 nova_compute[187157]: 2025-12-02 23:56:17.236 187161 DEBUG oslo_concurrency.lockutils [req-37b51441-d82f-4b26-a99c-a0b7c8fc67cb req-7df7a17a-15e0-4ff1-b3fc-db0e08402769 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-0b759275-94f4-4c19-857f-f04aa6b32c6a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:56:17 compute-1 nova_compute[187157]: 2025-12-02 23:56:17.236 187161 DEBUG nova.network.neutron [req-37b51441-d82f-4b26-a99c-a0b7c8fc67cb req-7df7a17a-15e0-4ff1-b3fc-db0e08402769 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Refreshing network info cache for port e54f1a66-edd4-4c1f-ae52-8de4515e4d18 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 02 23:56:17 compute-1 nova_compute[187157]: 2025-12-02 23:56:17.672 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "refresh_cache-0b759275-94f4-4c19-857f-f04aa6b32c6a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:56:17 compute-1 nova_compute[187157]: 2025-12-02 23:56:17.741 187161 WARNING neutronclient.v2_0.client [req-37b51441-d82f-4b26-a99c-a0b7c8fc67cb req-7df7a17a-15e0-4ff1-b3fc-db0e08402769 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:56:17 compute-1 nova_compute[187157]: 2025-12-02 23:56:17.894 187161 DEBUG nova.network.neutron [req-37b51441-d82f-4b26-a99c-a0b7c8fc67cb req-7df7a17a-15e0-4ff1-b3fc-db0e08402769 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:56:18 compute-1 nova_compute[187157]: 2025-12-02 23:56:18.063 187161 DEBUG nova.network.neutron [req-37b51441-d82f-4b26-a99c-a0b7c8fc67cb req-7df7a17a-15e0-4ff1-b3fc-db0e08402769 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:56:18 compute-1 nova_compute[187157]: 2025-12-02 23:56:18.571 187161 DEBUG oslo_concurrency.lockutils [req-37b51441-d82f-4b26-a99c-a0b7c8fc67cb req-7df7a17a-15e0-4ff1-b3fc-db0e08402769 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-0b759275-94f4-4c19-857f-f04aa6b32c6a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:56:18 compute-1 nova_compute[187157]: 2025-12-02 23:56:18.572 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquired lock "refresh_cache-0b759275-94f4-4c19-857f-f04aa6b32c6a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:56:18 compute-1 nova_compute[187157]: 2025-12-02 23:56:18.572 187161 DEBUG nova.network.neutron [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:56:19 compute-1 nova_compute[187157]: 2025-12-02 23:56:19.132 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:19 compute-1 openstack_network_exporter[199685]: ERROR   23:56:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:56:19 compute-1 openstack_network_exporter[199685]: ERROR   23:56:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:56:19 compute-1 openstack_network_exporter[199685]: ERROR   23:56:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:56:19 compute-1 openstack_network_exporter[199685]: ERROR   23:56:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:56:19 compute-1 openstack_network_exporter[199685]: ERROR   23:56:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:56:20 compute-1 nova_compute[187157]: 2025-12-02 23:56:20.150 187161 DEBUG nova.network.neutron [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:56:21 compute-1 nova_compute[187157]: 2025-12-02 23:56:21.931 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.004 187161 WARNING neutronclient.v2_0.client [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.259 187161 DEBUG nova.network.neutron [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Updating instance_info_cache with network_info: [{"id": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "address": "fa:16:3e:d1:42:8f", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape54f1a66-ed", "ovs_interfaceid": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.768 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Releasing lock "refresh_cache-0b759275-94f4-4c19-857f-f04aa6b32c6a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.769 187161 DEBUG nova.compute.manager [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Instance network_info: |[{"id": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "address": "fa:16:3e:d1:42:8f", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape54f1a66-ed", "ovs_interfaceid": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.774 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Start _get_guest_xml network_info=[{"id": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "address": "fa:16:3e:d1:42:8f", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape54f1a66-ed", "ovs_interfaceid": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.781 187161 WARNING nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.783 187161 DEBUG nova.virt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1322206593', uuid='0b759275-94f4-4c19-857f-f04aa6b32c6a'), owner=OwnerMeta(userid='d31b8a74cb3c48f3b147970eec936bca', username='tempest-TestExecuteActionsViaActuator-1889160444-project-admin', projectid='5f2368878ee9447ea8fcef9927711e2d', projectname='tempest-TestExecuteActionsViaActuator-1889160444'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "address": "fa:16:3e:d1:42:8f", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape54f1a66-ed", "ovs_interfaceid": 
"e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764719782.7836788) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.794 187161 DEBUG nova.virt.libvirt.host [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.795 187161 DEBUG nova.virt.libvirt.host [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.799 187161 DEBUG nova.virt.libvirt.host [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.800 187161 DEBUG nova.virt.libvirt.host [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.801 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.802 187161 DEBUG nova.virt.hardware [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.802 187161 DEBUG nova.virt.hardware [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.803 187161 DEBUG nova.virt.hardware [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.803 187161 DEBUG nova.virt.hardware [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.804 187161 DEBUG nova.virt.hardware [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.804 187161 DEBUG nova.virt.hardware [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.804 187161 DEBUG nova.virt.hardware [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.805 187161 DEBUG nova.virt.hardware [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.805 187161 DEBUG nova.virt.hardware [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.805 187161 DEBUG nova.virt.hardware [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.806 187161 DEBUG nova.virt.hardware [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.812 187161 DEBUG nova.virt.libvirt.vif [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-02T23:56:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1322206593',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1322206593',id=7,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-3b9yeltp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteAction
sViaActuator-1889160444-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:56:15Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=0b759275-94f4-4c19-857f-f04aa6b32c6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "address": "fa:16:3e:d1:42:8f", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape54f1a66-ed", "ovs_interfaceid": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.813 187161 DEBUG nova.network.os_vif_util [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "address": "fa:16:3e:d1:42:8f", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape54f1a66-ed", "ovs_interfaceid": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.814 187161 DEBUG nova.network.os_vif_util [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:42:8f,bridge_name='br-int',has_traffic_filtering=True,id=e54f1a66-edd4-4c1f-ae52-8de4515e4d18,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape54f1a66-ed') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:56:22 compute-1 nova_compute[187157]: 2025-12-02 23:56:22.815 187161 DEBUG nova.objects.instance [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lazy-loading 'pci_devices' on Instance uuid 0b759275-94f4-4c19-857f-f04aa6b32c6a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.324 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] End _get_guest_xml xml=<domain type="kvm">
Dec 02 23:56:23 compute-1 nova_compute[187157]:   <uuid>0b759275-94f4-4c19-857f-f04aa6b32c6a</uuid>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   <name>instance-00000007</name>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   <memory>131072</memory>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   <vcpu>1</vcpu>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   <metadata>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1322206593</nova:name>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-02 23:56:22</nova:creationTime>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 02 23:56:23 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 02 23:56:23 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 02 23:56:23 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 02 23:56:23 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 23:56:23 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 02 23:56:23 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 02 23:56:23 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 02 23:56:23 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 02 23:56:23 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 02 23:56:23 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 02 23:56:23 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 02 23:56:23 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 02 23:56:23 compute-1 nova_compute[187157]:         <nova:properties>
Dec 02 23:56:23 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 02 23:56:23 compute-1 nova_compute[187157]:         </nova:properties>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       </nova:image>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <nova:owner>
Dec 02 23:56:23 compute-1 nova_compute[187157]:         <nova:user uuid="d31b8a74cb3c48f3b147970eec936bca">tempest-TestExecuteActionsViaActuator-1889160444-project-admin</nova:user>
Dec 02 23:56:23 compute-1 nova_compute[187157]:         <nova:project uuid="5f2368878ee9447ea8fcef9927711e2d">tempest-TestExecuteActionsViaActuator-1889160444</nova:project>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       </nova:owner>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <nova:ports>
Dec 02 23:56:23 compute-1 nova_compute[187157]:         <nova:port uuid="e54f1a66-edd4-4c1f-ae52-8de4515e4d18">
Dec 02 23:56:23 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:         </nova:port>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       </nova:ports>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     </nova:instance>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   </metadata>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <system>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <entry name="serial">0b759275-94f4-4c19-857f-f04aa6b32c6a</entry>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <entry name="uuid">0b759275-94f4-4c19-857f-f04aa6b32c6a</entry>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     </system>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   </sysinfo>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   <os>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   </os>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   <features>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <acpi/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <apic/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <vmcoreinfo/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   </features>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   </clock>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact">
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <model>Nehalem</model>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <topology sockets="1" cores="1" threads="1"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   </cpu>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   <devices>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     </disk>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk.config"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     </disk>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:d1:42:8f"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <target dev="tape54f1a66-ed"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     </interface>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/console.log" append="off"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     </serial>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <video>
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     </video>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     </rng>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <controller type="usb" index="0"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 02 23:56:23 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 02 23:56:23 compute-1 nova_compute[187157]:     </memballoon>
Dec 02 23:56:23 compute-1 nova_compute[187157]:   </devices>
Dec 02 23:56:23 compute-1 nova_compute[187157]: </domain>
Dec 02 23:56:23 compute-1 nova_compute[187157]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.326 187161 DEBUG nova.compute.manager [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Preparing to wait for external event network-vif-plugged-e54f1a66-edd4-4c1f-ae52-8de4515e4d18 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.327 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.327 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.327 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.329 187161 DEBUG nova.virt.libvirt.vif [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-02T23:56:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1322206593',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1322206593',id=7,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-3b9yeltp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:56:15Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=0b759275-94f4-4c19-857f-f04aa6b32c6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "address": "fa:16:3e:d1:42:8f", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape54f1a66-ed", "ovs_interfaceid": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.329 187161 DEBUG nova.network.os_vif_util [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "address": "fa:16:3e:d1:42:8f", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape54f1a66-ed", "ovs_interfaceid": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.330 187161 DEBUG nova.network.os_vif_util [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:42:8f,bridge_name='br-int',has_traffic_filtering=True,id=e54f1a66-edd4-4c1f-ae52-8de4515e4d18,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape54f1a66-ed') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.331 187161 DEBUG os_vif [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:42:8f,bridge_name='br-int',has_traffic_filtering=True,id=e54f1a66-edd4-4c1f-ae52-8de4515e4d18,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape54f1a66-ed') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.332 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.332 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.333 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.334 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.334 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'dfd04725-a92b-50b0-aac4-4c6217b8a1fa', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.336 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.338 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.344 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.344 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape54f1a66-ed, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.345 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tape54f1a66-ed, col_values=(('qos', UUID('b6c62ba6-223f-470e-96c7-fcd90837f806')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.346 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tape54f1a66-ed, col_values=(('external_ids', {'iface-id': 'e54f1a66-edd4-4c1f-ae52-8de4515e4d18', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:42:8f', 'vm-uuid': '0b759275-94f4-4c19-857f-f04aa6b32c6a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.347 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:23 compute-1 NetworkManager[55553]: <info>  [1764719783.3488] manager: (tape54f1a66-ed): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.350 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.404 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:23 compute-1 nova_compute[187157]: 2025-12-02 23:56:23.406 187161 INFO os_vif [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:42:8f,bridge_name='br-int',has_traffic_filtering=True,id=e54f1a66-edd4-4c1f-ae52-8de4515e4d18,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape54f1a66-ed')
Dec 02 23:56:24 compute-1 nova_compute[187157]: 2025-12-02 23:56:24.982 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:56:24 compute-1 nova_compute[187157]: 2025-12-02 23:56:24.984 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:56:24 compute-1 nova_compute[187157]: 2025-12-02 23:56:24.985 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No VIF found with MAC fa:16:3e:d1:42:8f, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 02 23:56:24 compute-1 nova_compute[187157]: 2025-12-02 23:56:24.986 187161 INFO nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Using config drive
Dec 02 23:56:25 compute-1 podman[210568]: 2025-12-02 23:56:25.266038686 +0000 UTC m=+0.094281691 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:56:25 compute-1 nova_compute[187157]: 2025-12-02 23:56:25.499 187161 WARNING neutronclient.v2_0.client [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:56:26 compute-1 nova_compute[187157]: 2025-12-02 23:56:26.247 187161 INFO nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Creating config drive at /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk.config
Dec 02 23:56:26 compute-1 nova_compute[187157]: 2025-12-02 23:56:26.252 187161 DEBUG oslo_concurrency.processutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpss4at1ir execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:26 compute-1 nova_compute[187157]: 2025-12-02 23:56:26.395 187161 DEBUG oslo_concurrency.processutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpss4at1ir" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:26 compute-1 kernel: tape54f1a66-ed: entered promiscuous mode
Dec 02 23:56:26 compute-1 NetworkManager[55553]: <info>  [1764719786.4645] manager: (tape54f1a66-ed): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Dec 02 23:56:26 compute-1 ovn_controller[95464]: 2025-12-02T23:56:26Z|00058|binding|INFO|Claiming lport e54f1a66-edd4-4c1f-ae52-8de4515e4d18 for this chassis.
Dec 02 23:56:26 compute-1 ovn_controller[95464]: 2025-12-02T23:56:26Z|00059|binding|INFO|e54f1a66-edd4-4c1f-ae52-8de4515e4d18: Claiming fa:16:3e:d1:42:8f 10.100.0.5
Dec 02 23:56:26 compute-1 nova_compute[187157]: 2025-12-02 23:56:26.469 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:26.482 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:42:8f 10.100.0.5'], port_security=['fa:16:3e:d1:42:8f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '0b759275-94f4-4c19-857f-f04aa6b32c6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=e54f1a66-edd4-4c1f-ae52-8de4515e4d18) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:56:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:26.483 104348 INFO neutron.agent.ovn.metadata.agent [-] Port e54f1a66-edd4-4c1f-ae52-8de4515e4d18 in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a bound to our chassis
Dec 02 23:56:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:26.485 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:56:26 compute-1 ovn_controller[95464]: 2025-12-02T23:56:26Z|00060|binding|INFO|Setting lport e54f1a66-edd4-4c1f-ae52-8de4515e4d18 ovn-installed in OVS
Dec 02 23:56:26 compute-1 ovn_controller[95464]: 2025-12-02T23:56:26Z|00061|binding|INFO|Setting lport e54f1a66-edd4-4c1f-ae52-8de4515e4d18 up in Southbound
Dec 02 23:56:26 compute-1 nova_compute[187157]: 2025-12-02 23:56:26.503 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:26 compute-1 nova_compute[187157]: 2025-12-02 23:56:26.506 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:26 compute-1 systemd-udevd[210608]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:56:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:26.515 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[6fc7b8a4-e410-4b48-8727-6ca75e8584fe]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:26 compute-1 NetworkManager[55553]: <info>  [1764719786.5296] device (tape54f1a66-ed): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:56:26 compute-1 NetworkManager[55553]: <info>  [1764719786.5303] device (tape54f1a66-ed): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 02 23:56:26 compute-1 systemd-machined[153454]: New machine qemu-4-instance-00000007.
Dec 02 23:56:26 compute-1 systemd[1]: Started Virtual Machine qemu-4-instance-00000007.
Dec 02 23:56:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:26.570 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[f1733385-cd88-4097-b0f9-c81bfc637830]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:26.575 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[61af209b-2b00-49fc-87ba-879fffe4e583]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:26.624 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[7a3ff2b8-d597-45c3-82b0-ebfb1b858f25]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:26.650 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[69bfacbc-2a13-438a-9e48-9703aaf6ddb5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371237, 'reachable_time': 16394, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210623, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:26.675 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[0bfc994b-e90e-4310-bc75-dabe513208c3]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371249, 'tstamp': 371249}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210624, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371252, 'tstamp': 371252}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210624, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:26.677 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:26 compute-1 nova_compute[187157]: 2025-12-02 23:56:26.680 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:26.680 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec494140-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:26.681 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:56:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:26.682 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec494140-a0, col_values=(('external_ids', {'iface-id': '9ee451cb-cc6e-44d6-98fb-cdfa0566e521'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:26.682 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:56:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:26.684 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[478ca116-cc36-4505-b749-d42d0348a4f4]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ec494140-a5f4-4327-8807-d7248b1cdc9a\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ec494140-a5f4-4327-8807-d7248b1cdc9a\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:26 compute-1 nova_compute[187157]: 2025-12-02 23:56:26.934 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:27 compute-1 nova_compute[187157]: 2025-12-02 23:56:27.251 187161 DEBUG nova.compute.manager [req-28b6d6cd-c00b-4948-9460-416cbe46bb1e req-94a69f42-bc61-4093-8294-4686a49e2700 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Received event network-vif-plugged-e54f1a66-edd4-4c1f-ae52-8de4515e4d18 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:56:27 compute-1 nova_compute[187157]: 2025-12-02 23:56:27.252 187161 DEBUG oslo_concurrency.lockutils [req-28b6d6cd-c00b-4948-9460-416cbe46bb1e req-94a69f42-bc61-4093-8294-4686a49e2700 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:27 compute-1 nova_compute[187157]: 2025-12-02 23:56:27.252 187161 DEBUG oslo_concurrency.lockutils [req-28b6d6cd-c00b-4948-9460-416cbe46bb1e req-94a69f42-bc61-4093-8294-4686a49e2700 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:27 compute-1 nova_compute[187157]: 2025-12-02 23:56:27.253 187161 DEBUG oslo_concurrency.lockutils [req-28b6d6cd-c00b-4948-9460-416cbe46bb1e req-94a69f42-bc61-4093-8294-4686a49e2700 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:27 compute-1 nova_compute[187157]: 2025-12-02 23:56:27.253 187161 DEBUG nova.compute.manager [req-28b6d6cd-c00b-4948-9460-416cbe46bb1e req-94a69f42-bc61-4093-8294-4686a49e2700 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Processing event network-vif-plugged-e54f1a66-edd4-4c1f-ae52-8de4515e4d18 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 02 23:56:27 compute-1 nova_compute[187157]: 2025-12-02 23:56:27.254 187161 DEBUG nova.compute.manager [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 02 23:56:27 compute-1 nova_compute[187157]: 2025-12-02 23:56:27.258 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 02 23:56:27 compute-1 nova_compute[187157]: 2025-12-02 23:56:27.263 187161 INFO nova.virt.libvirt.driver [-] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Instance spawned successfully.
Dec 02 23:56:27 compute-1 nova_compute[187157]: 2025-12-02 23:56:27.263 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 02 23:56:27 compute-1 nova_compute[187157]: 2025-12-02 23:56:27.782 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:27 compute-1 nova_compute[187157]: 2025-12-02 23:56:27.784 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:27 compute-1 nova_compute[187157]: 2025-12-02 23:56:27.785 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:27 compute-1 nova_compute[187157]: 2025-12-02 23:56:27.786 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:27 compute-1 nova_compute[187157]: 2025-12-02 23:56:27.786 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:27 compute-1 nova_compute[187157]: 2025-12-02 23:56:27.787 187161 DEBUG nova.virt.libvirt.driver [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:28 compute-1 nova_compute[187157]: 2025-12-02 23:56:28.300 187161 INFO nova.compute.manager [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Took 11.96 seconds to spawn the instance on the hypervisor.
Dec 02 23:56:28 compute-1 nova_compute[187157]: 2025-12-02 23:56:28.301 187161 DEBUG nova.compute.manager [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 02 23:56:28 compute-1 nova_compute[187157]: 2025-12-02 23:56:28.350 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:28 compute-1 nova_compute[187157]: 2025-12-02 23:56:28.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:28 compute-1 nova_compute[187157]: 2025-12-02 23:56:28.848 187161 INFO nova.compute.manager [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Took 17.68 seconds to build instance.
Dec 02 23:56:29 compute-1 podman[210633]: 2025-12-02 23:56:29.278778259 +0000 UTC m=+0.109578582 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=watcher_latest, container_name=ovn_controller)
Dec 02 23:56:29 compute-1 nova_compute[187157]: 2025-12-02 23:56:29.322 187161 DEBUG nova.compute.manager [req-7c09c576-d9e2-416a-b6b8-e736f66d0e37 req-7ea1b33b-933c-4f93-891b-ae44c28175ff 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Received event network-vif-plugged-e54f1a66-edd4-4c1f-ae52-8de4515e4d18 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:56:29 compute-1 nova_compute[187157]: 2025-12-02 23:56:29.322 187161 DEBUG oslo_concurrency.lockutils [req-7c09c576-d9e2-416a-b6b8-e736f66d0e37 req-7ea1b33b-933c-4f93-891b-ae44c28175ff 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:29 compute-1 nova_compute[187157]: 2025-12-02 23:56:29.323 187161 DEBUG oslo_concurrency.lockutils [req-7c09c576-d9e2-416a-b6b8-e736f66d0e37 req-7ea1b33b-933c-4f93-891b-ae44c28175ff 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:29 compute-1 nova_compute[187157]: 2025-12-02 23:56:29.323 187161 DEBUG oslo_concurrency.lockutils [req-7c09c576-d9e2-416a-b6b8-e736f66d0e37 req-7ea1b33b-933c-4f93-891b-ae44c28175ff 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:29 compute-1 nova_compute[187157]: 2025-12-02 23:56:29.323 187161 DEBUG nova.compute.manager [req-7c09c576-d9e2-416a-b6b8-e736f66d0e37 req-7ea1b33b-933c-4f93-891b-ae44c28175ff 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] No waiting events found dispatching network-vif-plugged-e54f1a66-edd4-4c1f-ae52-8de4515e4d18 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:56:29 compute-1 nova_compute[187157]: 2025-12-02 23:56:29.323 187161 WARNING nova.compute.manager [req-7c09c576-d9e2-416a-b6b8-e736f66d0e37 req-7ea1b33b-933c-4f93-891b-ae44c28175ff 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Received unexpected event network-vif-plugged-e54f1a66-edd4-4c1f-ae52-8de4515e4d18 for instance with vm_state active and task_state None.
Dec 02 23:56:29 compute-1 nova_compute[187157]: 2025-12-02 23:56:29.355 187161 DEBUG oslo_concurrency.lockutils [None req-44e21ea3-410c-4ad5-ad1b-b4001e8d481a d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "0b759275-94f4-4c19-857f-f04aa6b32c6a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.217s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:30 compute-1 nova_compute[187157]: 2025-12-02 23:56:30.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:30 compute-1 nova_compute[187157]: 2025-12-02 23:56:30.702 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:31 compute-1 nova_compute[187157]: 2025-12-02 23:56:31.937 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:32 compute-1 podman[210660]: 2025-12-02 23:56:32.227141314 +0000 UTC m=+0.060892141 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Dec 02 23:56:32 compute-1 nova_compute[187157]: 2025-12-02 23:56:32.696 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:32 compute-1 nova_compute[187157]: 2025-12-02 23:56:32.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:32 compute-1 nova_compute[187157]: 2025-12-02 23:56:32.699 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:56:33 compute-1 nova_compute[187157]: 2025-12-02 23:56:33.392 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:34 compute-1 nova_compute[187157]: 2025-12-02 23:56:34.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:35 compute-1 nova_compute[187157]: 2025-12-02 23:56:35.223 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:35 compute-1 nova_compute[187157]: 2025-12-02 23:56:35.223 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:35 compute-1 nova_compute[187157]: 2025-12-02 23:56:35.224 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:35 compute-1 nova_compute[187157]: 2025-12-02 23:56:35.224 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:56:35 compute-1 podman[197537]: time="2025-12-02T23:56:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:56:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:56:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 02 23:56:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:56:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3057 "" "Go-http-client/1.1"
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.279 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.368 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.369 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.435 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.440 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.490 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.491 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.546 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.551 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.604 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.605 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.658 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.826 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.827 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.851 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.852 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5398MB free_disk=73.11141204833984GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.853 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.853 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:36 compute-1 nova_compute[187157]: 2025-12-02 23:56:36.988 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:37 compute-1 nova_compute[187157]: 2025-12-02 23:56:37.908 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 2e1c5d01-3310-41d8-8a6d-780b09f6bf06 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 02 23:56:37 compute-1 nova_compute[187157]: 2025-12-02 23:56:37.908 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance d8ccd45c-e570-4b75-b836-a93e2de1818b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 02 23:56:37 compute-1 nova_compute[187157]: 2025-12-02 23:56:37.908 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 0b759275-94f4-4c19-857f-f04aa6b32c6a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 02 23:56:37 compute-1 nova_compute[187157]: 2025-12-02 23:56:37.909 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:56:37 compute-1 nova_compute[187157]: 2025-12-02 23:56:37.909 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:56:36 up  1:03,  0 user,  load average: 0.70, 0.38, 0.41\n', 'num_instances': '3', 'num_vm_active': '3', 'num_task_None': '3', 'num_os_type_None': '3', 'num_proj_5f2368878ee9447ea8fcef9927711e2d': '3', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:56:37 compute-1 nova_compute[187157]: 2025-12-02 23:56:37.961 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:56:38 compute-1 nova_compute[187157]: 2025-12-02 23:56:38.398 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:38 compute-1 nova_compute[187157]: 2025-12-02 23:56:38.469 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:56:38 compute-1 nova_compute[187157]: 2025-12-02 23:56:38.983 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:56:38 compute-1 nova_compute[187157]: 2025-12-02 23:56:38.984 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.131s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:40 compute-1 ovn_controller[95464]: 2025-12-02T23:56:40Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d1:42:8f 10.100.0.5
Dec 02 23:56:40 compute-1 ovn_controller[95464]: 2025-12-02T23:56:40Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d1:42:8f 10.100.0.5
Dec 02 23:56:40 compute-1 nova_compute[187157]: 2025-12-02 23:56:40.983 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:41 compute-1 nova_compute[187157]: 2025-12-02 23:56:41.499 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:41 compute-1 nova_compute[187157]: 2025-12-02 23:56:41.499 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:41 compute-1 nova_compute[187157]: 2025-12-02 23:56:41.990 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:42 compute-1 podman[210725]: 2025-12-02 23:56:42.274444195 +0000 UTC m=+0.101340887 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1755695350, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git)
Dec 02 23:56:42 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:42.652 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:56:42 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:42.652 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:56:42 compute-1 nova_compute[187157]: 2025-12-02 23:56:42.653 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:42 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:56:42.655 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:43 compute-1 nova_compute[187157]: 2025-12-02 23:56:43.401 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:46 compute-1 nova_compute[187157]: 2025-12-02 23:56:46.993 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:48 compute-1 podman[210747]: 2025-12-02 23:56:48.253335138 +0000 UTC m=+0.091082246 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.4)
Dec 02 23:56:48 compute-1 nova_compute[187157]: 2025-12-02 23:56:48.404 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:49 compute-1 openstack_network_exporter[199685]: ERROR   23:56:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:56:49 compute-1 openstack_network_exporter[199685]: ERROR   23:56:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:56:49 compute-1 openstack_network_exporter[199685]: ERROR   23:56:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:56:49 compute-1 openstack_network_exporter[199685]: ERROR   23:56:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:56:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:56:49 compute-1 openstack_network_exporter[199685]: ERROR   23:56:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:56:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:56:51 compute-1 nova_compute[187157]: 2025-12-02 23:56:51.995 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:53 compute-1 nova_compute[187157]: 2025-12-02 23:56:53.405 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:56 compute-1 podman[210779]: 2025-12-02 23:56:56.256586928 +0000 UTC m=+0.071556714 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 23:56:56 compute-1 nova_compute[187157]: 2025-12-02 23:56:56.996 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:58 compute-1 nova_compute[187157]: 2025-12-02 23:56:58.408 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:00 compute-1 podman[210805]: 2025-12-02 23:57:00.269358692 +0000 UTC m=+0.099090834 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=watcher_latest)
Dec 02 23:57:00 compute-1 nova_compute[187157]: 2025-12-02 23:57:00.970 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:00 compute-1 nova_compute[187157]: 2025-12-02 23:57:00.971 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:01 compute-1 nova_compute[187157]: 2025-12-02 23:57:01.478 187161 DEBUG nova.compute.manager [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 02 23:57:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:01.704 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:01.704 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:01.705 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:01 compute-1 nova_compute[187157]: 2025-12-02 23:57:01.997 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:02 compute-1 nova_compute[187157]: 2025-12-02 23:57:02.043 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:02 compute-1 nova_compute[187157]: 2025-12-02 23:57:02.044 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:02 compute-1 nova_compute[187157]: 2025-12-02 23:57:02.055 187161 DEBUG nova.virt.hardware [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 02 23:57:02 compute-1 nova_compute[187157]: 2025-12-02 23:57:02.056 187161 INFO nova.compute.claims [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Claim successful on node compute-1.ctlplane.example.com
Dec 02 23:57:03 compute-1 nova_compute[187157]: 2025-12-02 23:57:03.171 187161 DEBUG nova.compute.provider_tree [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:57:03 compute-1 podman[210832]: 2025-12-02 23:57:03.220083762 +0000 UTC m=+0.057272105 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Dec 02 23:57:03 compute-1 nova_compute[187157]: 2025-12-02 23:57:03.410 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:03 compute-1 nova_compute[187157]: 2025-12-02 23:57:03.682 187161 DEBUG nova.scheduler.client.report [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:57:04 compute-1 nova_compute[187157]: 2025-12-02 23:57:04.194 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.150s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:04 compute-1 nova_compute[187157]: 2025-12-02 23:57:04.195 187161 DEBUG nova.compute.manager [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 02 23:57:04 compute-1 nova_compute[187157]: 2025-12-02 23:57:04.725 187161 DEBUG nova.compute.manager [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 02 23:57:04 compute-1 nova_compute[187157]: 2025-12-02 23:57:04.726 187161 DEBUG nova.network.neutron [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 02 23:57:04 compute-1 nova_compute[187157]: 2025-12-02 23:57:04.727 187161 WARNING neutronclient.v2_0.client [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:04 compute-1 nova_compute[187157]: 2025-12-02 23:57:04.728 187161 WARNING neutronclient.v2_0.client [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:05 compute-1 nova_compute[187157]: 2025-12-02 23:57:05.245 187161 INFO nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 02 23:57:05 compute-1 podman[197537]: time="2025-12-02T23:57:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:57:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:57:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 02 23:57:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:57:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3053 "" "Go-http-client/1.1"
Dec 02 23:57:05 compute-1 nova_compute[187157]: 2025-12-02 23:57:05.753 187161 DEBUG nova.compute.manager [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.294 187161 DEBUG nova.network.neutron [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Successfully created port: 3ff98f13-ac75-44a9-b36f-3c729c73fc57 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.770 187161 DEBUG nova.compute.manager [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.771 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.771 187161 INFO nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Creating image(s)
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.772 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "/var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.772 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "/var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.772 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "/var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.773 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.775 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.777 187161 DEBUG oslo_concurrency.processutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.857 187161 DEBUG oslo_concurrency.processutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.857 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.858 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.858 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.861 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.862 187161 DEBUG oslo_concurrency.processutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.924 187161 DEBUG oslo_concurrency.processutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.925 187161 DEBUG oslo_concurrency.processutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.952 187161 DEBUG oslo_concurrency.processutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.953 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.095s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:06 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.954 187161 DEBUG oslo_concurrency.processutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:06.999 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:07.026 187161 DEBUG oslo_concurrency.processutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:07.026 187161 DEBUG nova.virt.disk.api [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Checking if we can resize image /var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:07.026 187161 DEBUG oslo_concurrency.processutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:07.111 187161 DEBUG oslo_concurrency.processutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:07.112 187161 DEBUG nova.virt.disk.api [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Cannot resize image /var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:07.113 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:07.113 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Ensure instance console log exists: /var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:07.113 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:07.113 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:07.114 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:07.560 187161 DEBUG nova.network.neutron [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Successfully updated port: 3ff98f13-ac75-44a9-b36f-3c729c73fc57 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:07.660 187161 DEBUG nova.compute.manager [req-c0633411-8219-4cf6-9142-b0c24d0fbd24 req-92284d6e-4833-4903-b24b-61adf53feef2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Received event network-changed-3ff98f13-ac75-44a9-b36f-3c729c73fc57 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:07.660 187161 DEBUG nova.compute.manager [req-c0633411-8219-4cf6-9142-b0c24d0fbd24 req-92284d6e-4833-4903-b24b-61adf53feef2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Refreshing instance network info cache due to event network-changed-3ff98f13-ac75-44a9-b36f-3c729c73fc57. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:07.661 187161 DEBUG oslo_concurrency.lockutils [req-c0633411-8219-4cf6-9142-b0c24d0fbd24 req-92284d6e-4833-4903-b24b-61adf53feef2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-e5b8a0f2-4b3a-4069-a535-5179df8ffa6a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:07.661 187161 DEBUG oslo_concurrency.lockutils [req-c0633411-8219-4cf6-9142-b0c24d0fbd24 req-92284d6e-4833-4903-b24b-61adf53feef2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-e5b8a0f2-4b3a-4069-a535-5179df8ffa6a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:57:07 compute-1 nova_compute[187157]: 2025-12-02 23:57:07.662 187161 DEBUG nova.network.neutron [req-c0633411-8219-4cf6-9142-b0c24d0fbd24 req-92284d6e-4833-4903-b24b-61adf53feef2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Refreshing network info cache for port 3ff98f13-ac75-44a9-b36f-3c729c73fc57 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 02 23:57:08 compute-1 nova_compute[187157]: 2025-12-02 23:57:08.068 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "refresh_cache-e5b8a0f2-4b3a-4069-a535-5179df8ffa6a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:57:08 compute-1 nova_compute[187157]: 2025-12-02 23:57:08.168 187161 WARNING neutronclient.v2_0.client [req-c0633411-8219-4cf6-9142-b0c24d0fbd24 req-92284d6e-4833-4903-b24b-61adf53feef2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:08 compute-1 nova_compute[187157]: 2025-12-02 23:57:08.254 187161 DEBUG nova.network.neutron [req-c0633411-8219-4cf6-9142-b0c24d0fbd24 req-92284d6e-4833-4903-b24b-61adf53feef2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:57:08 compute-1 nova_compute[187157]: 2025-12-02 23:57:08.412 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:08 compute-1 nova_compute[187157]: 2025-12-02 23:57:08.415 187161 DEBUG nova.network.neutron [req-c0633411-8219-4cf6-9142-b0c24d0fbd24 req-92284d6e-4833-4903-b24b-61adf53feef2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:57:08 compute-1 nova_compute[187157]: 2025-12-02 23:57:08.923 187161 DEBUG oslo_concurrency.lockutils [req-c0633411-8219-4cf6-9142-b0c24d0fbd24 req-92284d6e-4833-4903-b24b-61adf53feef2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-e5b8a0f2-4b3a-4069-a535-5179df8ffa6a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:57:08 compute-1 nova_compute[187157]: 2025-12-02 23:57:08.924 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquired lock "refresh_cache-e5b8a0f2-4b3a-4069-a535-5179df8ffa6a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:57:08 compute-1 nova_compute[187157]: 2025-12-02 23:57:08.924 187161 DEBUG nova.network.neutron [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:57:10 compute-1 nova_compute[187157]: 2025-12-02 23:57:10.154 187161 DEBUG nova.network.neutron [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:57:11 compute-1 nova_compute[187157]: 2025-12-02 23:57:11.266 187161 WARNING neutronclient.v2_0.client [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.003 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.185 187161 DEBUG nova.network.neutron [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Updating instance_info_cache with network_info: [{"id": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "address": "fa:16:3e:e0:18:82", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ff98f13-ac", "ovs_interfaceid": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.694 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Releasing lock "refresh_cache-e5b8a0f2-4b3a-4069-a535-5179df8ffa6a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.695 187161 DEBUG nova.compute.manager [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Instance network_info: |[{"id": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "address": "fa:16:3e:e0:18:82", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ff98f13-ac", "ovs_interfaceid": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.699 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Start _get_guest_xml network_info=[{"id": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "address": "fa:16:3e:e0:18:82", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ff98f13-ac", "ovs_interfaceid": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.704 187161 WARNING nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.706 187161 DEBUG nova.virt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-415186711', uuid='e5b8a0f2-4b3a-4069-a535-5179df8ffa6a'), owner=OwnerMeta(userid='d31b8a74cb3c48f3b147970eec936bca', username='tempest-TestExecuteActionsViaActuator-1889160444-project-admin', projectid='5f2368878ee9447ea8fcef9927711e2d', projectname='tempest-TestExecuteActionsViaActuator-1889160444'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "address": "fa:16:3e:e0:18:82", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ff98f13-ac", "ovs_interfaceid": 
"3ff98f13-ac75-44a9-b36f-3c729c73fc57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764719832.706268) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.712 187161 DEBUG nova.virt.libvirt.host [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.713 187161 DEBUG nova.virt.libvirt.host [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.716 187161 DEBUG nova.virt.libvirt.host [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.716 187161 DEBUG nova.virt.libvirt.host [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.717 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.717 187161 DEBUG nova.virt.hardware [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.718 187161 DEBUG nova.virt.hardware [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.718 187161 DEBUG nova.virt.hardware [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.718 187161 DEBUG nova.virt.hardware [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.718 187161 DEBUG nova.virt.hardware [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.719 187161 DEBUG nova.virt.hardware [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.719 187161 DEBUG nova.virt.hardware [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.719 187161 DEBUG nova.virt.hardware [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.719 187161 DEBUG nova.virt.hardware [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.719 187161 DEBUG nova.virt.hardware [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.720 187161 DEBUG nova.virt.hardware [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.723 187161 DEBUG nova.virt.libvirt.vif [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-02T23:56:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-415186711',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-415186711',id=9,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-eb36m3sp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsV
iaActuator-1889160444-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:57:05Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=e5b8a0f2-4b3a-4069-a535-5179df8ffa6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "address": "fa:16:3e:e0:18:82", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ff98f13-ac", "ovs_interfaceid": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.723 187161 DEBUG nova.network.os_vif_util [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "address": "fa:16:3e:e0:18:82", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ff98f13-ac", "ovs_interfaceid": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.724 187161 DEBUG nova.network.os_vif_util [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:18:82,bridge_name='br-int',has_traffic_filtering=True,id=3ff98f13-ac75-44a9-b36f-3c729c73fc57,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ff98f13-ac') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:57:12 compute-1 nova_compute[187157]: 2025-12-02 23:57:12.724 187161 DEBUG nova.objects.instance [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lazy-loading 'pci_devices' on Instance uuid e5b8a0f2-4b3a-4069-a535-5179df8ffa6a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:57:13 compute-1 podman[210867]: 2025-12-02 23:57:13.214195237 +0000 UTC m=+0.056366188 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, config_id=edpm, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.233 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] End _get_guest_xml xml=<domain type="kvm">
Dec 02 23:57:13 compute-1 nova_compute[187157]:   <uuid>e5b8a0f2-4b3a-4069-a535-5179df8ffa6a</uuid>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   <name>instance-00000009</name>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   <memory>131072</memory>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   <vcpu>1</vcpu>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   <metadata>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-415186711</nova:name>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-02 23:57:12</nova:creationTime>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 02 23:57:13 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 02 23:57:13 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 02 23:57:13 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 02 23:57:13 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 23:57:13 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 02 23:57:13 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 02 23:57:13 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 02 23:57:13 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 02 23:57:13 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 02 23:57:13 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 02 23:57:13 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 02 23:57:13 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 02 23:57:13 compute-1 nova_compute[187157]:         <nova:properties>
Dec 02 23:57:13 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 02 23:57:13 compute-1 nova_compute[187157]:         </nova:properties>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       </nova:image>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <nova:owner>
Dec 02 23:57:13 compute-1 nova_compute[187157]:         <nova:user uuid="d31b8a74cb3c48f3b147970eec936bca">tempest-TestExecuteActionsViaActuator-1889160444-project-admin</nova:user>
Dec 02 23:57:13 compute-1 nova_compute[187157]:         <nova:project uuid="5f2368878ee9447ea8fcef9927711e2d">tempest-TestExecuteActionsViaActuator-1889160444</nova:project>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       </nova:owner>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <nova:ports>
Dec 02 23:57:13 compute-1 nova_compute[187157]:         <nova:port uuid="3ff98f13-ac75-44a9-b36f-3c729c73fc57">
Dec 02 23:57:13 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:         </nova:port>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       </nova:ports>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     </nova:instance>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   </metadata>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <system>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <entry name="serial">e5b8a0f2-4b3a-4069-a535-5179df8ffa6a</entry>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <entry name="uuid">e5b8a0f2-4b3a-4069-a535-5179df8ffa6a</entry>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     </system>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   </sysinfo>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   <os>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   </os>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   <features>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <acpi/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <apic/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <vmcoreinfo/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   </features>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   </clock>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact">
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <model>Nehalem</model>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <topology sockets="1" cores="1" threads="1"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   </cpu>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   <devices>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     </disk>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk.config"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     </disk>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:e0:18:82"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <target dev="tap3ff98f13-ac"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     </interface>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/console.log" append="off"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     </serial>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <video>
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     </video>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     </rng>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <controller type="usb" index="0"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 02 23:57:13 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 02 23:57:13 compute-1 nova_compute[187157]:     </memballoon>
Dec 02 23:57:13 compute-1 nova_compute[187157]:   </devices>
Dec 02 23:57:13 compute-1 nova_compute[187157]: </domain>
Dec 02 23:57:13 compute-1 nova_compute[187157]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.234 187161 DEBUG nova.compute.manager [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Preparing to wait for external event network-vif-plugged-3ff98f13-ac75-44a9-b36f-3c729c73fc57 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.234 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.235 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.235 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.235 187161 DEBUG nova.virt.libvirt.vif [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-02T23:56:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-415186711',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-415186711',id=9,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-eb36m3sp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecu
teActionsViaActuator-1889160444-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:57:05Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=e5b8a0f2-4b3a-4069-a535-5179df8ffa6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "address": "fa:16:3e:e0:18:82", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ff98f13-ac", "ovs_interfaceid": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.236 187161 DEBUG nova.network.os_vif_util [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "address": "fa:16:3e:e0:18:82", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ff98f13-ac", "ovs_interfaceid": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.236 187161 DEBUG nova.network.os_vif_util [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:18:82,bridge_name='br-int',has_traffic_filtering=True,id=3ff98f13-ac75-44a9-b36f-3c729c73fc57,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ff98f13-ac') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.237 187161 DEBUG os_vif [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:18:82,bridge_name='br-int',has_traffic_filtering=True,id=3ff98f13-ac75-44a9-b36f-3c729c73fc57,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ff98f13-ac') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.237 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.237 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.238 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.238 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.239 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'fc7b8cbb-4a3d-5f2b-8b4d-192f551fe4e4', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.239 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.240 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.242 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.242 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ff98f13-ac, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.243 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3ff98f13-ac, col_values=(('qos', UUID('1d00ae94-1d76-4674-a0c9-0bde7c692c22')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.243 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3ff98f13-ac, col_values=(('external_ids', {'iface-id': '3ff98f13-ac75-44a9-b36f-3c729c73fc57', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:18:82', 'vm-uuid': 'e5b8a0f2-4b3a-4069-a535-5179df8ffa6a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.244 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:13 compute-1 NetworkManager[55553]: <info>  [1764719833.2449] manager: (tap3ff98f13-ac): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.247 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.250 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:13 compute-1 nova_compute[187157]: 2025-12-02 23:57:13.250 187161 INFO os_vif [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:18:82,bridge_name='br-int',has_traffic_filtering=True,id=3ff98f13-ac75-44a9-b36f-3c729c73fc57,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ff98f13-ac')
Dec 02 23:57:14 compute-1 nova_compute[187157]: 2025-12-02 23:57:14.792 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:57:14 compute-1 nova_compute[187157]: 2025-12-02 23:57:14.792 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:57:14 compute-1 nova_compute[187157]: 2025-12-02 23:57:14.792 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No VIF found with MAC fa:16:3e:e0:18:82, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 02 23:57:14 compute-1 nova_compute[187157]: 2025-12-02 23:57:14.793 187161 INFO nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Using config drive
Dec 02 23:57:15 compute-1 nova_compute[187157]: 2025-12-02 23:57:15.307 187161 WARNING neutronclient.v2_0.client [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:16 compute-1 nova_compute[187157]: 2025-12-02 23:57:16.334 187161 INFO nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Creating config drive at /var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk.config
Dec 02 23:57:16 compute-1 nova_compute[187157]: 2025-12-02 23:57:16.341 187161 DEBUG oslo_concurrency.processutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpjmrg9f_c execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:16 compute-1 nova_compute[187157]: 2025-12-02 23:57:16.484 187161 DEBUG oslo_concurrency.processutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpjmrg9f_c" returned: 0 in 0.143s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:16 compute-1 kernel: tap3ff98f13-ac: entered promiscuous mode
Dec 02 23:57:16 compute-1 NetworkManager[55553]: <info>  [1764719836.5490] manager: (tap3ff98f13-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Dec 02 23:57:16 compute-1 nova_compute[187157]: 2025-12-02 23:57:16.572 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:16 compute-1 ovn_controller[95464]: 2025-12-02T23:57:16Z|00062|binding|INFO|Claiming lport 3ff98f13-ac75-44a9-b36f-3c729c73fc57 for this chassis.
Dec 02 23:57:16 compute-1 ovn_controller[95464]: 2025-12-02T23:57:16Z|00063|binding|INFO|3ff98f13-ac75-44a9-b36f-3c729c73fc57: Claiming fa:16:3e:e0:18:82 10.100.0.3
Dec 02 23:57:16 compute-1 systemd-udevd[210907]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:57:16 compute-1 ovn_controller[95464]: 2025-12-02T23:57:16Z|00064|binding|INFO|Setting lport 3ff98f13-ac75-44a9-b36f-3c729c73fc57 ovn-installed in OVS
Dec 02 23:57:16 compute-1 ovn_controller[95464]: 2025-12-02T23:57:16Z|00065|binding|INFO|Setting lport 3ff98f13-ac75-44a9-b36f-3c729c73fc57 up in Southbound
Dec 02 23:57:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:16.602 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:18:82 10.100.0.3'], port_security=['fa:16:3e:e0:18:82 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'e5b8a0f2-4b3a-4069-a535-5179df8ffa6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=3ff98f13-ac75-44a9-b36f-3c729c73fc57) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:57:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:16.603 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 3ff98f13-ac75-44a9-b36f-3c729c73fc57 in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a bound to our chassis
Dec 02 23:57:16 compute-1 nova_compute[187157]: 2025-12-02 23:57:16.603 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:16 compute-1 nova_compute[187157]: 2025-12-02 23:57:16.604 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:16.607 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:57:16 compute-1 NetworkManager[55553]: <info>  [1764719836.6165] device (tap3ff98f13-ac): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:57:16 compute-1 NetworkManager[55553]: <info>  [1764719836.6184] device (tap3ff98f13-ac): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 02 23:57:16 compute-1 systemd-machined[153454]: New machine qemu-5-instance-00000009.
Dec 02 23:57:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:16.631 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[429d70fe-3f4f-4ac8-b38e-5b60cc8971a1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:16 compute-1 systemd[1]: Started Virtual Machine qemu-5-instance-00000009.
Dec 02 23:57:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:16.687 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[542631e3-cd9b-45e1-a430-ba8172a1ba35]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:16.692 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[05dba237-fbf4-4530-a72e-6fe8f69fb57b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:16.732 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[ec4f761b-f892-4fb4-8ffe-cf679fda2af2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:16.746 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d7a90fb0-d618-4dda-8eec-c38b4b52df55]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371237, 'reachable_time': 16394, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210922, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:16.761 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[fda8a1df-1944-4f56-b211-bd9495de5aae]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371249, 'tstamp': 371249}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210923, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371252, 'tstamp': 371252}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210923, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:16.762 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:16 compute-1 nova_compute[187157]: 2025-12-02 23:57:16.763 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:16 compute-1 nova_compute[187157]: 2025-12-02 23:57:16.764 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:16.764 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec494140-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:16.764 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:57:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:16.765 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec494140-a0, col_values=(('external_ids', {'iface-id': '9ee451cb-cc6e-44d6-98fb-cdfa0566e521'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:16.765 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:57:16 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:16.766 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[9ac66904-d8a0-4a2a-bf68-ab6c7b26a790]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ec494140-a5f4-4327-8807-d7248b1cdc9a\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ec494140-a5f4-4327-8807-d7248b1cdc9a\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:16 compute-1 nova_compute[187157]: 2025-12-02 23:57:16.823 187161 DEBUG nova.compute.manager [req-66b84cfd-cc41-4119-b3bf-ac04949477ec req-3899f40a-028d-4f2d-a5c9-869fff63ebb4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Received event network-vif-plugged-3ff98f13-ac75-44a9-b36f-3c729c73fc57 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:16 compute-1 nova_compute[187157]: 2025-12-02 23:57:16.823 187161 DEBUG oslo_concurrency.lockutils [req-66b84cfd-cc41-4119-b3bf-ac04949477ec req-3899f40a-028d-4f2d-a5c9-869fff63ebb4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:16 compute-1 nova_compute[187157]: 2025-12-02 23:57:16.824 187161 DEBUG oslo_concurrency.lockutils [req-66b84cfd-cc41-4119-b3bf-ac04949477ec req-3899f40a-028d-4f2d-a5c9-869fff63ebb4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:16 compute-1 nova_compute[187157]: 2025-12-02 23:57:16.824 187161 DEBUG oslo_concurrency.lockutils [req-66b84cfd-cc41-4119-b3bf-ac04949477ec req-3899f40a-028d-4f2d-a5c9-869fff63ebb4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:16 compute-1 nova_compute[187157]: 2025-12-02 23:57:16.824 187161 DEBUG nova.compute.manager [req-66b84cfd-cc41-4119-b3bf-ac04949477ec req-3899f40a-028d-4f2d-a5c9-869fff63ebb4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Processing event network-vif-plugged-3ff98f13-ac75-44a9-b36f-3c729c73fc57 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 02 23:57:17 compute-1 nova_compute[187157]: 2025-12-02 23:57:17.004 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:17 compute-1 nova_compute[187157]: 2025-12-02 23:57:17.163 187161 DEBUG nova.compute.manager [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 02 23:57:17 compute-1 nova_compute[187157]: 2025-12-02 23:57:17.167 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 02 23:57:17 compute-1 nova_compute[187157]: 2025-12-02 23:57:17.170 187161 INFO nova.virt.libvirt.driver [-] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Instance spawned successfully.
Dec 02 23:57:17 compute-1 nova_compute[187157]: 2025-12-02 23:57:17.170 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 02 23:57:17 compute-1 nova_compute[187157]: 2025-12-02 23:57:17.682 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:57:17 compute-1 nova_compute[187157]: 2025-12-02 23:57:17.685 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:57:17 compute-1 nova_compute[187157]: 2025-12-02 23:57:17.686 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:57:17 compute-1 nova_compute[187157]: 2025-12-02 23:57:17.687 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:57:17 compute-1 nova_compute[187157]: 2025-12-02 23:57:17.688 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:57:17 compute-1 nova_compute[187157]: 2025-12-02 23:57:17.689 187161 DEBUG nova.virt.libvirt.driver [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:57:18 compute-1 nova_compute[187157]: 2025-12-02 23:57:18.209 187161 INFO nova.compute.manager [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Took 11.44 seconds to spawn the instance on the hypervisor.
Dec 02 23:57:18 compute-1 nova_compute[187157]: 2025-12-02 23:57:18.209 187161 DEBUG nova.compute.manager [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 02 23:57:18 compute-1 nova_compute[187157]: 2025-12-02 23:57:18.245 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:18 compute-1 nova_compute[187157]: 2025-12-02 23:57:18.743 187161 INFO nova.compute.manager [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Took 16.75 seconds to build instance.
Dec 02 23:57:18 compute-1 nova_compute[187157]: 2025-12-02 23:57:18.910 187161 DEBUG nova.compute.manager [req-03491f03-3753-4f24-8be0-c4660f48d46b req-597d8876-48b8-4ddb-b65e-d2c562963a0b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Received event network-vif-plugged-3ff98f13-ac75-44a9-b36f-3c729c73fc57 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:18 compute-1 nova_compute[187157]: 2025-12-02 23:57:18.910 187161 DEBUG oslo_concurrency.lockutils [req-03491f03-3753-4f24-8be0-c4660f48d46b req-597d8876-48b8-4ddb-b65e-d2c562963a0b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:18 compute-1 nova_compute[187157]: 2025-12-02 23:57:18.911 187161 DEBUG oslo_concurrency.lockutils [req-03491f03-3753-4f24-8be0-c4660f48d46b req-597d8876-48b8-4ddb-b65e-d2c562963a0b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:18 compute-1 nova_compute[187157]: 2025-12-02 23:57:18.911 187161 DEBUG oslo_concurrency.lockutils [req-03491f03-3753-4f24-8be0-c4660f48d46b req-597d8876-48b8-4ddb-b65e-d2c562963a0b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:18 compute-1 nova_compute[187157]: 2025-12-02 23:57:18.911 187161 DEBUG nova.compute.manager [req-03491f03-3753-4f24-8be0-c4660f48d46b req-597d8876-48b8-4ddb-b65e-d2c562963a0b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] No waiting events found dispatching network-vif-plugged-3ff98f13-ac75-44a9-b36f-3c729c73fc57 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:57:18 compute-1 nova_compute[187157]: 2025-12-02 23:57:18.911 187161 WARNING nova.compute.manager [req-03491f03-3753-4f24-8be0-c4660f48d46b req-597d8876-48b8-4ddb-b65e-d2c562963a0b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Received unexpected event network-vif-plugged-3ff98f13-ac75-44a9-b36f-3c729c73fc57 for instance with vm_state active and task_state None.
Dec 02 23:57:19 compute-1 nova_compute[187157]: 2025-12-02 23:57:19.250 187161 DEBUG oslo_concurrency.lockutils [None req-797fedfb-9807-470c-95be-8f770b9265f5 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.279s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:19 compute-1 podman[210931]: 2025-12-02 23:57:19.272598538 +0000 UTC m=+0.098330991 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 23:57:19 compute-1 openstack_network_exporter[199685]: ERROR   23:57:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:57:19 compute-1 openstack_network_exporter[199685]: ERROR   23:57:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:57:19 compute-1 openstack_network_exporter[199685]: ERROR   23:57:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:57:19 compute-1 openstack_network_exporter[199685]: ERROR   23:57:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:57:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:57:19 compute-1 openstack_network_exporter[199685]: ERROR   23:57:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:57:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:57:22 compute-1 nova_compute[187157]: 2025-12-02 23:57:22.008 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:23 compute-1 nova_compute[187157]: 2025-12-02 23:57:23.249 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:27 compute-1 nova_compute[187157]: 2025-12-02 23:57:27.010 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:27 compute-1 podman[210951]: 2025-12-02 23:57:27.211375626 +0000 UTC m=+0.053698325 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 23:57:27 compute-1 nova_compute[187157]: 2025-12-02 23:57:27.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:27 compute-1 nova_compute[187157]: 2025-12-02 23:57:27.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 02 23:57:28 compute-1 nova_compute[187157]: 2025-12-02 23:57:28.253 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:28 compute-1 nova_compute[187157]: 2025-12-02 23:57:28.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:28 compute-1 nova_compute[187157]: 2025-12-02 23:57:28.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:29 compute-1 ovn_controller[95464]: 2025-12-02T23:57:29Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e0:18:82 10.100.0.3
Dec 02 23:57:29 compute-1 ovn_controller[95464]: 2025-12-02T23:57:29Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e0:18:82 10.100.0.3
Dec 02 23:57:31 compute-1 podman[210990]: 2025-12-02 23:57:31.384919034 +0000 UTC m=+0.212670495 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 23:57:31 compute-1 nova_compute[187157]: 2025-12-02 23:57:31.965 187161 DEBUG nova.compute.manager [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Stashing vm_state: active _prep_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:6173
Dec 02 23:57:32 compute-1 nova_compute[187157]: 2025-12-02 23:57:32.012 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:32 compute-1 nova_compute[187157]: 2025-12-02 23:57:32.206 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:32 compute-1 nova_compute[187157]: 2025-12-02 23:57:32.497 187161 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:32 compute-1 nova_compute[187157]: 2025-12-02 23:57:32.498 187161 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:32 compute-1 nova_compute[187157]: 2025-12-02 23:57:32.695 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:32 compute-1 nova_compute[187157]: 2025-12-02 23:57:32.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:32 compute-1 nova_compute[187157]: 2025-12-02 23:57:32.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:32 compute-1 nova_compute[187157]: 2025-12-02 23:57:32.699 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:57:33 compute-1 nova_compute[187157]: 2025-12-02 23:57:33.009 187161 DEBUG nova.objects.instance [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'pci_requests' on Instance uuid 35a3db0d-2b6a-47be-bc85-4b164026935c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:57:33 compute-1 nova_compute[187157]: 2025-12-02 23:57:33.160 187161 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Creating tmpfile /var/lib/nova/instances/tmpz8owyb_p to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 02 23:57:33 compute-1 nova_compute[187157]: 2025-12-02 23:57:33.161 187161 WARNING neutronclient.v2_0.client [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:33 compute-1 nova_compute[187157]: 2025-12-02 23:57:33.257 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:33 compute-1 nova_compute[187157]: 2025-12-02 23:57:33.289 187161 DEBUG nova.compute.manager [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=71680,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpz8owyb_p',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 02 23:57:33 compute-1 podman[211019]: 2025-12-02 23:57:33.354142053 +0000 UTC m=+0.045847517 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 23:57:33 compute-1 nova_compute[187157]: 2025-12-02 23:57:33.520 187161 DEBUG nova.virt.hardware [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 02 23:57:33 compute-1 nova_compute[187157]: 2025-12-02 23:57:33.521 187161 INFO nova.compute.claims [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Claim successful on node compute-1.ctlplane.example.com
Dec 02 23:57:33 compute-1 nova_compute[187157]: 2025-12-02 23:57:33.521 187161 DEBUG nova.objects.instance [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'resources' on Instance uuid 35a3db0d-2b6a-47be-bc85-4b164026935c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:57:34 compute-1 nova_compute[187157]: 2025-12-02 23:57:34.027 187161 DEBUG nova.objects.base [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<35a3db0d-2b6a-47be-bc85-4b164026935c> lazy-loaded attributes: pci_requests,resources wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 02 23:57:34 compute-1 nova_compute[187157]: 2025-12-02 23:57:34.027 187161 DEBUG nova.objects.instance [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'numa_topology' on Instance uuid 35a3db0d-2b6a-47be-bc85-4b164026935c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:57:34 compute-1 nova_compute[187157]: 2025-12-02 23:57:34.535 187161 DEBUG nova.objects.base [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<35a3db0d-2b6a-47be-bc85-4b164026935c> lazy-loaded attributes: pci_requests,resources,numa_topology wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 02 23:57:34 compute-1 nova_compute[187157]: 2025-12-02 23:57:34.535 187161 DEBUG nova.objects.instance [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'pci_devices' on Instance uuid 35a3db0d-2b6a-47be-bc85-4b164026935c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:57:35 compute-1 nova_compute[187157]: 2025-12-02 23:57:35.044 187161 DEBUG nova.objects.base [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<35a3db0d-2b6a-47be-bc85-4b164026935c> lazy-loaded attributes: pci_requests,resources,numa_topology,pci_devices wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 02 23:57:35 compute-1 nova_compute[187157]: 2025-12-02 23:57:35.388 187161 WARNING neutronclient.v2_0.client [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:35 compute-1 nova_compute[187157]: 2025-12-02 23:57:35.557 187161 INFO nova.compute.resource_tracker [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Updating resource usage from migration c53386d3-3d97-4c78-a0cf-66ce9d67e567
Dec 02 23:57:35 compute-1 nova_compute[187157]: 2025-12-02 23:57:35.558 187161 DEBUG nova.compute.resource_tracker [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Starting to track incoming migration c53386d3-3d97-4c78-a0cf-66ce9d67e567 with flavor b2669e62-ef04-4b34-b3d6-69efcfbafbdc _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 02 23:57:35 compute-1 podman[197537]: time="2025-12-02T23:57:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:57:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:57:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 02 23:57:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:57:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3052 "" "Go-http-client/1.1"
Dec 02 23:57:36 compute-1 nova_compute[187157]: 2025-12-02 23:57:36.216 187161 DEBUG nova.compute.provider_tree [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:57:36 compute-1 nova_compute[187157]: 2025-12-02 23:57:36.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:36 compute-1 nova_compute[187157]: 2025-12-02 23:57:36.726 187161 DEBUG nova.scheduler.client.report [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:57:37 compute-1 nova_compute[187157]: 2025-12-02 23:57:37.015 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:37 compute-1 nova_compute[187157]: 2025-12-02 23:57:37.213 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:37 compute-1 nova_compute[187157]: 2025-12-02 23:57:37.243 187161 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 4.745s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:37 compute-1 nova_compute[187157]: 2025-12-02 23:57:37.243 187161 INFO nova.compute.manager [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Migrating
Dec 02 23:57:37 compute-1 nova_compute[187157]: 2025-12-02 23:57:37.253 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.040s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:37 compute-1 nova_compute[187157]: 2025-12-02 23:57:37.253 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:37 compute-1 nova_compute[187157]: 2025-12-02 23:57:37.254 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.259 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.326 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.394 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.396 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.460 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.468 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.523 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.525 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.616 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.624 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.695 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.697 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.792 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.799 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.885 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.886 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:38 compute-1 nova_compute[187157]: 2025-12-02 23:57:38.968 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:39 compute-1 nova_compute[187157]: 2025-12-02 23:57:39.153 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:57:39 compute-1 nova_compute[187157]: 2025-12-02 23:57:39.154 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:39 compute-1 nova_compute[187157]: 2025-12-02 23:57:39.173 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:39 compute-1 nova_compute[187157]: 2025-12-02 23:57:39.174 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5234MB free_disk=73.05381393432617GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:57:39 compute-1 nova_compute[187157]: 2025-12-02 23:57:39.174 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:39 compute-1 nova_compute[187157]: 2025-12-02 23:57:39.175 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:39 compute-1 nova_compute[187157]: 2025-12-02 23:57:39.242 187161 DEBUG nova.compute.manager [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpz8owyb_p',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d66a42a4-6bab-485d-a45f-0df43bf25d1b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 02 23:57:40 compute-1 nova_compute[187157]: 2025-12-02 23:57:40.198 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Migration for instance d66a42a4-6bab-485d-a45f-0df43bf25d1b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 02 23:57:40 compute-1 nova_compute[187157]: 2025-12-02 23:57:40.198 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Migration for instance 35a3db0d-2b6a-47be-bc85-4b164026935c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 02 23:57:40 compute-1 nova_compute[187157]: 2025-12-02 23:57:40.253 187161 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-d66a42a4-6bab-485d-a45f-0df43bf25d1b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:57:40 compute-1 nova_compute[187157]: 2025-12-02 23:57:40.254 187161 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-d66a42a4-6bab-485d-a45f-0df43bf25d1b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:57:40 compute-1 nova_compute[187157]: 2025-12-02 23:57:40.254 187161 DEBUG nova.network.neutron [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:57:40 compute-1 nova_compute[187157]: 2025-12-02 23:57:40.764 187161 WARNING neutronclient.v2_0.client [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:41 compute-1 nova_compute[187157]: 2025-12-02 23:57:41.216 187161 INFO nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Updating resource usage from migration eb5aabe1-cb88-4102-8f99-b6fe8e8e8562
Dec 02 23:57:41 compute-1 nova_compute[187157]: 2025-12-02 23:57:41.217 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Starting to track incoming migration eb5aabe1-cb88-4102-8f99-b6fe8e8e8562 with flavor b2669e62-ef04-4b34-b3d6-69efcfbafbdc _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 02 23:57:41 compute-1 nova_compute[187157]: 2025-12-02 23:57:41.724 187161 INFO nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Updating resource usage from migration c53386d3-3d97-4c78-a0cf-66ce9d67e567
Dec 02 23:57:41 compute-1 nova_compute[187157]: 2025-12-02 23:57:41.725 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Starting to track incoming migration c53386d3-3d97-4c78-a0cf-66ce9d67e567 with flavor b2669e62-ef04-4b34-b3d6-69efcfbafbdc _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 02 23:57:41 compute-1 nova_compute[187157]: 2025-12-02 23:57:41.750 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 2e1c5d01-3310-41d8-8a6d-780b09f6bf06 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 02 23:57:41 compute-1 nova_compute[187157]: 2025-12-02 23:57:41.751 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance d8ccd45c-e570-4b75-b836-a93e2de1818b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 02 23:57:41 compute-1 nova_compute[187157]: 2025-12-02 23:57:41.751 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 0b759275-94f4-4c19-857f-f04aa6b32c6a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 02 23:57:41 compute-1 nova_compute[187157]: 2025-12-02 23:57:41.752 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance e5b8a0f2-4b3a-4069-a535-5179df8ffa6a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 02 23:57:42 compute-1 nova_compute[187157]: 2025-12-02 23:57:42.017 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:42 compute-1 nova_compute[187157]: 2025-12-02 23:57:42.113 187161 WARNING neutronclient.v2_0.client [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:42 compute-1 nova_compute[187157]: 2025-12-02 23:57:42.259 187161 WARNING nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance d66a42a4-6bab-485d-a45f-0df43bf25d1b has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Dec 02 23:57:42 compute-1 nova_compute[187157]: 2025-12-02 23:57:42.367 187161 DEBUG nova.network.neutron [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Updating instance_info_cache with network_info: [{"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:57:42 compute-1 sshd-session[211077]: Accepted publickey for nova from 192.168.122.100 port 56116 ssh2: ECDSA SHA256:3AllEFUYW7uiMxyM2nTMuXWI0wJTJaAim9Lq1c5tGGQ
Dec 02 23:57:42 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Dec 02 23:57:42 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Dec 02 23:57:42 compute-1 systemd-logind[790]: New session 33 of user nova.
Dec 02 23:57:42 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Dec 02 23:57:42 compute-1 systemd[1]: Starting User Manager for UID 42436...
Dec 02 23:57:42 compute-1 systemd[211081]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 02 23:57:42 compute-1 nova_compute[187157]: 2025-12-02 23:57:42.765 187161 WARNING nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 35a3db0d-2b6a-47be-bc85-4b164026935c has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Dec 02 23:57:42 compute-1 nova_compute[187157]: 2025-12-02 23:57:42.768 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 6 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:57:42 compute-1 nova_compute[187157]: 2025-12-02 23:57:42.768 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=1344MB phys_disk=79GB used_disk=6GB total_vcpus=8 used_vcpus=6 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:57:39 up  1:04,  0 user,  load average: 0.39, 0.34, 0.39\n', 'num_instances': '4', 'num_vm_active': '4', 'num_task_None': '4', 'num_os_type_None': '4', 'num_proj_5f2368878ee9447ea8fcef9927711e2d': '4', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:57:42 compute-1 systemd[211081]: Queued start job for default target Main User Target.
Dec 02 23:57:42 compute-1 systemd[211081]: Created slice User Application Slice.
Dec 02 23:57:42 compute-1 systemd[211081]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 02 23:57:42 compute-1 systemd[211081]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 23:57:42 compute-1 systemd[211081]: Reached target Paths.
Dec 02 23:57:42 compute-1 systemd[211081]: Reached target Timers.
Dec 02 23:57:42 compute-1 nova_compute[187157]: 2025-12-02 23:57:42.873 187161 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-d66a42a4-6bab-485d-a45f-0df43bf25d1b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:57:42 compute-1 systemd[211081]: Starting D-Bus User Message Bus Socket...
Dec 02 23:57:42 compute-1 systemd[211081]: Starting Create User's Volatile Files and Directories...
Dec 02 23:57:42 compute-1 nova_compute[187157]: 2025-12-02 23:57:42.889 187161 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpz8owyb_p',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d66a42a4-6bab-485d-a45f-0df43bf25d1b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 02 23:57:42 compute-1 nova_compute[187157]: 2025-12-02 23:57:42.890 187161 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Creating instance directory: /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 02 23:57:42 compute-1 systemd[211081]: Listening on D-Bus User Message Bus Socket.
Dec 02 23:57:42 compute-1 systemd[211081]: Reached target Sockets.
Dec 02 23:57:42 compute-1 nova_compute[187157]: 2025-12-02 23:57:42.890 187161 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Creating disk.info with the contents: {'/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk': 'qcow2', '/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 02 23:57:42 compute-1 nova_compute[187157]: 2025-12-02 23:57:42.891 187161 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 02 23:57:42 compute-1 nova_compute[187157]: 2025-12-02 23:57:42.892 187161 DEBUG nova.objects.instance [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'trusted_certs' on Instance uuid d66a42a4-6bab-485d-a45f-0df43bf25d1b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:57:42 compute-1 systemd[211081]: Finished Create User's Volatile Files and Directories.
Dec 02 23:57:42 compute-1 systemd[211081]: Reached target Basic System.
Dec 02 23:57:42 compute-1 systemd[211081]: Reached target Main User Target.
Dec 02 23:57:42 compute-1 systemd[211081]: Startup finished in 179ms.
Dec 02 23:57:42 compute-1 systemd[1]: Started User Manager for UID 42436.
Dec 02 23:57:42 compute-1 systemd[1]: Started Session 33 of User nova.
Dec 02 23:57:42 compute-1 sshd-session[211077]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 02 23:57:42 compute-1 nova_compute[187157]: 2025-12-02 23:57:42.935 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:57:42 compute-1 sshd-session[211096]: Received disconnect from 192.168.122.100 port 56116:11: disconnected by user
Dec 02 23:57:42 compute-1 sshd-session[211096]: Disconnected from user nova 192.168.122.100 port 56116
Dec 02 23:57:42 compute-1 sshd-session[211077]: pam_unix(sshd:session): session closed for user nova
Dec 02 23:57:42 compute-1 systemd[1]: session-33.scope: Deactivated successfully.
Dec 02 23:57:42 compute-1 systemd-logind[790]: Session 33 logged out. Waiting for processes to exit.
Dec 02 23:57:42 compute-1 systemd-logind[790]: Removed session 33.
Dec 02 23:57:43 compute-1 sshd-session[211098]: Accepted publickey for nova from 192.168.122.100 port 56118 ssh2: ECDSA SHA256:3AllEFUYW7uiMxyM2nTMuXWI0wJTJaAim9Lq1c5tGGQ
Dec 02 23:57:43 compute-1 systemd-logind[790]: New session 35 of user nova.
Dec 02 23:57:43 compute-1 systemd[1]: Started Session 35 of User nova.
Dec 02 23:57:43 compute-1 sshd-session[211098]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 02 23:57:43 compute-1 sshd-session[211101]: Received disconnect from 192.168.122.100 port 56118:11: disconnected by user
Dec 02 23:57:43 compute-1 sshd-session[211101]: Disconnected from user nova 192.168.122.100 port 56118
Dec 02 23:57:43 compute-1 sshd-session[211098]: pam_unix(sshd:session): session closed for user nova
Dec 02 23:57:43 compute-1 systemd[1]: session-35.scope: Deactivated successfully.
Dec 02 23:57:43 compute-1 systemd-logind[790]: Session 35 logged out. Waiting for processes to exit.
Dec 02 23:57:43 compute-1 systemd-logind[790]: Removed session 35.
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.261 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.399 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.407 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.410 187161 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.443 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.499 187161 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.500 187161 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.501 187161 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.502 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.510 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.510 187161 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.605 187161 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.606 187161 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.648 187161 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.650 187161 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.651 187161 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.743 187161 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.744 187161 DEBUG nova.virt.disk.api [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.745 187161 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.843 187161 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.844 187161 DEBUG nova.virt.disk.api [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.845 187161 DEBUG nova.objects.instance [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid d66a42a4-6bab-485d-a45f-0df43bf25d1b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.962 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.963 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.788s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.963 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:43 compute-1 nova_compute[187157]: 2025-12-02 23:57:43.964 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 02 23:57:44 compute-1 podman[211118]: 2025-12-02 23:57:44.278680989 +0000 UTC m=+0.098074105 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.362 187161 DEBUG nova.objects.base [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<d66a42a4-6bab-485d-a45f-0df43bf25d1b> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.363 187161 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.408 187161 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk.config 497664" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.410 187161 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.413 187161 DEBUG nova.virt.libvirt.vif [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-02T23:55:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-116577734',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-116577734',id=6,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:56:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-dt7jcyvd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virti
o',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:56:04Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=d66a42a4-6bab-485d-a45f-0df43bf25d1b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.414 187161 DEBUG nova.network.os_vif_util [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.416 187161 DEBUG nova.network.os_vif_util [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:d7:48,bridge_name='br-int',has_traffic_filtering=True,id=aa1a4037-7471-48e2-8297-5aeb45672ebb,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa1a4037-74') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.417 187161 DEBUG os_vif [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:d7:48,bridge_name='br-int',has_traffic_filtering=True,id=aa1a4037-7471-48e2-8297-5aeb45672ebb,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa1a4037-74') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.418 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.419 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.420 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.421 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.422 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'f72fb5b5-47db-50dd-881f-4c2fc19fab24', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.424 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.426 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.431 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.431 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaa1a4037-74, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.432 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapaa1a4037-74, col_values=(('qos', UUID('23765d3a-29f6-4182-b025-a79c422b34f9')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.433 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapaa1a4037-74, col_values=(('external_ids', {'iface-id': 'aa1a4037-7471-48e2-8297-5aeb45672ebb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:d7:48', 'vm-uuid': 'd66a42a4-6bab-485d-a45f-0df43bf25d1b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.434 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:44 compute-1 NetworkManager[55553]: <info>  [1764719864.4361] manager: (tapaa1a4037-74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.438 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.446 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.448 187161 INFO os_vif [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:d7:48,bridge_name='br-int',has_traffic_filtering=True,id=aa1a4037-7471-48e2-8297-5aeb45672ebb,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa1a4037-74')
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.449 187161 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.450 187161 DEBUG nova.compute.manager [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpz8owyb_p',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d66a42a4-6bab-485d-a45f-0df43bf25d1b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.452 187161 WARNING neutronclient.v2_0.client [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.471 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 02 23:57:44 compute-1 nova_compute[187157]: 2025-12-02 23:57:44.555 187161 WARNING neutronclient.v2_0.client [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:45 compute-1 nova_compute[187157]: 2025-12-02 23:57:45.472 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:45 compute-1 nova_compute[187157]: 2025-12-02 23:57:45.474 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:45 compute-1 nova_compute[187157]: 2025-12-02 23:57:45.581 187161 DEBUG nova.compute.manager [req-08cc2127-8638-4855-932e-23e36f5da58c req-0d5f124e-5683-4296-845b-de1c46e517d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:45 compute-1 nova_compute[187157]: 2025-12-02 23:57:45.581 187161 DEBUG oslo_concurrency.lockutils [req-08cc2127-8638-4855-932e-23e36f5da58c req-0d5f124e-5683-4296-845b-de1c46e517d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:45 compute-1 nova_compute[187157]: 2025-12-02 23:57:45.582 187161 DEBUG oslo_concurrency.lockutils [req-08cc2127-8638-4855-932e-23e36f5da58c req-0d5f124e-5683-4296-845b-de1c46e517d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:45 compute-1 nova_compute[187157]: 2025-12-02 23:57:45.582 187161 DEBUG oslo_concurrency.lockutils [req-08cc2127-8638-4855-932e-23e36f5da58c req-0d5f124e-5683-4296-845b-de1c46e517d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:45 compute-1 nova_compute[187157]: 2025-12-02 23:57:45.582 187161 DEBUG nova.compute.manager [req-08cc2127-8638-4855-932e-23e36f5da58c req-0d5f124e-5683-4296-845b-de1c46e517d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] No waiting events found dispatching network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:57:45 compute-1 nova_compute[187157]: 2025-12-02 23:57:45.583 187161 WARNING nova.compute.manager [req-08cc2127-8638-4855-932e-23e36f5da58c req-0d5f124e-5683-4296-845b-de1c46e517d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received unexpected event network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 for instance with vm_state active and task_state resize_migrating.
Dec 02 23:57:45 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:45.675 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:57:45 compute-1 nova_compute[187157]: 2025-12-02 23:57:45.676 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:45 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:45.677 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:57:45 compute-1 nova_compute[187157]: 2025-12-02 23:57:45.684 187161 DEBUG nova.network.neutron [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Port aa1a4037-7471-48e2-8297-5aeb45672ebb updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 02 23:57:45 compute-1 nova_compute[187157]: 2025-12-02 23:57:45.699 187161 DEBUG nova.compute.manager [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpz8owyb_p',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d66a42a4-6bab-485d-a45f-0df43bf25d1b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 02 23:57:46 compute-1 sshd-session[211146]: Accepted publickey for nova from 192.168.122.100 port 56124 ssh2: ECDSA SHA256:3AllEFUYW7uiMxyM2nTMuXWI0wJTJaAim9Lq1c5tGGQ
Dec 02 23:57:46 compute-1 systemd-logind[790]: New session 36 of user nova.
Dec 02 23:57:46 compute-1 systemd[1]: Started Session 36 of User nova.
Dec 02 23:57:46 compute-1 sshd-session[211146]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 02 23:57:47 compute-1 nova_compute[187157]: 2025-12-02 23:57:47.021 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:47 compute-1 sshd-session[211149]: Received disconnect from 192.168.122.100 port 56124:11: disconnected by user
Dec 02 23:57:47 compute-1 sshd-session[211149]: Disconnected from user nova 192.168.122.100 port 56124
Dec 02 23:57:47 compute-1 sshd-session[211146]: pam_unix(sshd:session): session closed for user nova
Dec 02 23:57:47 compute-1 systemd[1]: session-36.scope: Deactivated successfully.
Dec 02 23:57:47 compute-1 systemd-logind[790]: Session 36 logged out. Waiting for processes to exit.
Dec 02 23:57:47 compute-1 systemd-logind[790]: Removed session 36.
Dec 02 23:57:47 compute-1 sshd-session[211151]: Accepted publickey for nova from 192.168.122.100 port 56128 ssh2: ECDSA SHA256:3AllEFUYW7uiMxyM2nTMuXWI0wJTJaAim9Lq1c5tGGQ
Dec 02 23:57:47 compute-1 systemd-logind[790]: New session 37 of user nova.
Dec 02 23:57:47 compute-1 systemd[1]: Started Session 37 of User nova.
Dec 02 23:57:47 compute-1 sshd-session[211151]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 02 23:57:47 compute-1 sshd-session[211154]: Received disconnect from 192.168.122.100 port 56128:11: disconnected by user
Dec 02 23:57:47 compute-1 sshd-session[211154]: Disconnected from user nova 192.168.122.100 port 56128
Dec 02 23:57:47 compute-1 sshd-session[211151]: pam_unix(sshd:session): session closed for user nova
Dec 02 23:57:47 compute-1 systemd[1]: session-37.scope: Deactivated successfully.
Dec 02 23:57:47 compute-1 systemd-logind[790]: Session 37 logged out. Waiting for processes to exit.
Dec 02 23:57:47 compute-1 systemd-logind[790]: Removed session 37.
Dec 02 23:57:47 compute-1 sshd-session[211156]: Accepted publickey for nova from 192.168.122.100 port 56142 ssh2: ECDSA SHA256:3AllEFUYW7uiMxyM2nTMuXWI0wJTJaAim9Lq1c5tGGQ
Dec 02 23:57:47 compute-1 systemd-logind[790]: New session 38 of user nova.
Dec 02 23:57:47 compute-1 systemd[1]: Started Session 38 of User nova.
Dec 02 23:57:47 compute-1 sshd-session[211156]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 02 23:57:47 compute-1 sshd-session[211160]: Received disconnect from 192.168.122.100 port 56142:11: disconnected by user
Dec 02 23:57:47 compute-1 sshd-session[211160]: Disconnected from user nova 192.168.122.100 port 56142
Dec 02 23:57:47 compute-1 sshd-session[211156]: pam_unix(sshd:session): session closed for user nova
Dec 02 23:57:47 compute-1 systemd[1]: session-38.scope: Deactivated successfully.
Dec 02 23:57:47 compute-1 systemd-logind[790]: Session 38 logged out. Waiting for processes to exit.
Dec 02 23:57:47 compute-1 systemd-logind[790]: Removed session 38.
Dec 02 23:57:47 compute-1 nova_compute[187157]: 2025-12-02 23:57:47.718 187161 DEBUG nova.compute.manager [req-8d1c219e-52ea-48e4-852e-516b6fe40743 req-293e809d-ead7-4c07-8edd-9d4c02fd9b6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:47 compute-1 nova_compute[187157]: 2025-12-02 23:57:47.718 187161 DEBUG oslo_concurrency.lockutils [req-8d1c219e-52ea-48e4-852e-516b6fe40743 req-293e809d-ead7-4c07-8edd-9d4c02fd9b6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:47 compute-1 nova_compute[187157]: 2025-12-02 23:57:47.719 187161 DEBUG oslo_concurrency.lockutils [req-8d1c219e-52ea-48e4-852e-516b6fe40743 req-293e809d-ead7-4c07-8edd-9d4c02fd9b6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:47 compute-1 nova_compute[187157]: 2025-12-02 23:57:47.719 187161 DEBUG oslo_concurrency.lockutils [req-8d1c219e-52ea-48e4-852e-516b6fe40743 req-293e809d-ead7-4c07-8edd-9d4c02fd9b6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:47 compute-1 nova_compute[187157]: 2025-12-02 23:57:47.720 187161 DEBUG nova.compute.manager [req-8d1c219e-52ea-48e4-852e-516b6fe40743 req-293e809d-ead7-4c07-8edd-9d4c02fd9b6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] No waiting events found dispatching network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:57:47 compute-1 nova_compute[187157]: 2025-12-02 23:57:47.720 187161 WARNING nova.compute.manager [req-8d1c219e-52ea-48e4-852e-516b6fe40743 req-293e809d-ead7-4c07-8edd-9d4c02fd9b6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received unexpected event network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 for instance with vm_state active and task_state resize_migrating.
Dec 02 23:57:48 compute-1 systemd[1]: Starting libvirt proxy daemon...
Dec 02 23:57:48 compute-1 systemd[1]: Started libvirt proxy daemon.
Dec 02 23:57:49 compute-1 kernel: tapaa1a4037-74: entered promiscuous mode
Dec 02 23:57:49 compute-1 NetworkManager[55553]: <info>  [1764719869.1408] manager: (tapaa1a4037-74): new Tun device (/org/freedesktop/NetworkManager/Devices/35)
Dec 02 23:57:49 compute-1 ovn_controller[95464]: 2025-12-02T23:57:49Z|00066|binding|INFO|Claiming lport aa1a4037-7471-48e2-8297-5aeb45672ebb for this additional chassis.
Dec 02 23:57:49 compute-1 nova_compute[187157]: 2025-12-02 23:57:49.193 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:49 compute-1 ovn_controller[95464]: 2025-12-02T23:57:49Z|00067|binding|INFO|aa1a4037-7471-48e2-8297-5aeb45672ebb: Claiming fa:16:3e:fd:d7:48 10.100.0.12
Dec 02 23:57:49 compute-1 nova_compute[187157]: 2025-12-02 23:57:49.196 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:49 compute-1 ovn_controller[95464]: 2025-12-02T23:57:49Z|00068|binding|INFO|Setting lport aa1a4037-7471-48e2-8297-5aeb45672ebb ovn-installed in OVS
Dec 02 23:57:49 compute-1 nova_compute[187157]: 2025-12-02 23:57:49.213 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:49 compute-1 systemd-udevd[211193]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:57:49 compute-1 NetworkManager[55553]: <info>  [1764719869.2359] device (tapaa1a4037-74): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:57:49 compute-1 NetworkManager[55553]: <info>  [1764719869.2366] device (tapaa1a4037-74): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 02 23:57:49 compute-1 systemd-machined[153454]: New machine qemu-6-instance-00000006.
Dec 02 23:57:49 compute-1 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Dec 02 23:57:49 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:49.314 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:d7:48 10.100.0.12'], port_security=['fa:16:3e:fd:d7:48 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd66a42a4-6bab-485d-a45f-0df43bf25d1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[], logical_port=aa1a4037-7471-48e2-8297-5aeb45672ebb) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:57:49 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:49.317 104348 INFO neutron.agent.ovn.metadata.agent [-] Port aa1a4037-7471-48e2-8297-5aeb45672ebb in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a unbound from our chassis
Dec 02 23:57:49 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:49.319 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:57:49 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:49.346 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[409334f4-b32e-48c8-9b88-8533226e2e0c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:49 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:49.394 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[97bf0415-cbbe-46fa-850c-716ffa7c557b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:49 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:49.400 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[77a3c12a-12ca-4f1a-bf61-bd9d1b2fc3d4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:49 compute-1 podman[211199]: 2025-12-02 23:57:49.404449109 +0000 UTC m=+0.074183065 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 02 23:57:49 compute-1 openstack_network_exporter[199685]: ERROR   23:57:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:57:49 compute-1 openstack_network_exporter[199685]: ERROR   23:57:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:57:49 compute-1 openstack_network_exporter[199685]: ERROR   23:57:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:57:49 compute-1 openstack_network_exporter[199685]: ERROR   23:57:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:57:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:57:49 compute-1 openstack_network_exporter[199685]: ERROR   23:57:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:57:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:57:49 compute-1 nova_compute[187157]: 2025-12-02 23:57:49.435 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:49 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:49.451 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[53c74c04-e21a-4915-bf42-d454d82ef83e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:49 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:49.465 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[cf7702c0-d2d3-44d0-85c3-5fcf552aeb9e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 11, 'rx_bytes': 868, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 11, 'rx_bytes': 868, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371237, 'reachable_time': 16394, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211230, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:49 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:49.477 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[1b590c55-dd3d-4390-bb63-721d1d5fd8de]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371249, 'tstamp': 371249}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211231, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371252, 'tstamp': 371252}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211231, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:49 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:49.479 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:49 compute-1 nova_compute[187157]: 2025-12-02 23:57:49.481 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:49 compute-1 nova_compute[187157]: 2025-12-02 23:57:49.482 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:49 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:49.482 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec494140-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:49 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:49.482 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:57:49 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:49.483 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec494140-a0, col_values=(('external_ids', {'iface-id': '9ee451cb-cc6e-44d6-98fb-cdfa0566e521'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:49 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:49.483 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:57:49 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:49.484 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[1dca47f5-07fe-48ab-bb8e-d4fd373fd7df]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ec494140-a5f4-4327-8807-d7248b1cdc9a\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ec494140-a5f4-4327-8807-d7248b1cdc9a\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:49 compute-1 nova_compute[187157]: 2025-12-02 23:57:49.976 187161 WARNING neutronclient.v2_0.client [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:50 compute-1 nova_compute[187157]: 2025-12-02 23:57:50.218 187161 INFO nova.network.neutron [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Updating port 933e46ed-57a7-472a-adf9-eff09ae7c559 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Dec 02 23:57:52 compute-1 nova_compute[187157]: 2025-12-02 23:57:52.025 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:52 compute-1 nova_compute[187157]: 2025-12-02 23:57:52.158 187161 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:57:52 compute-1 nova_compute[187157]: 2025-12-02 23:57:52.158 187161 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:57:52 compute-1 nova_compute[187157]: 2025-12-02 23:57:52.159 187161 DEBUG nova.network.neutron [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:57:52 compute-1 nova_compute[187157]: 2025-12-02 23:57:52.310 187161 DEBUG nova.compute.manager [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-changed-933e46ed-57a7-472a-adf9-eff09ae7c559 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:52 compute-1 nova_compute[187157]: 2025-12-02 23:57:52.311 187161 DEBUG nova.compute.manager [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Refreshing instance network info cache due to event network-changed-933e46ed-57a7-472a-adf9-eff09ae7c559. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 02 23:57:52 compute-1 nova_compute[187157]: 2025-12-02 23:57:52.311 187161 DEBUG oslo_concurrency.lockutils [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:57:52 compute-1 ovn_controller[95464]: 2025-12-02T23:57:52Z|00069|binding|INFO|Claiming lport aa1a4037-7471-48e2-8297-5aeb45672ebb for this chassis.
Dec 02 23:57:52 compute-1 ovn_controller[95464]: 2025-12-02T23:57:52Z|00070|binding|INFO|aa1a4037-7471-48e2-8297-5aeb45672ebb: Claiming fa:16:3e:fd:d7:48 10.100.0.12
Dec 02 23:57:52 compute-1 ovn_controller[95464]: 2025-12-02T23:57:52Z|00071|binding|INFO|Setting lport aa1a4037-7471-48e2-8297-5aeb45672ebb up in Southbound
Dec 02 23:57:52 compute-1 nova_compute[187157]: 2025-12-02 23:57:52.667 187161 WARNING neutronclient.v2_0.client [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:53 compute-1 nova_compute[187157]: 2025-12-02 23:57:53.138 187161 WARNING neutronclient.v2_0.client [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:53 compute-1 nova_compute[187157]: 2025-12-02 23:57:53.405 187161 DEBUG nova.network.neutron [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Updating instance_info_cache with network_info: [{"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:57:53 compute-1 nova_compute[187157]: 2025-12-02 23:57:53.604 187161 INFO nova.compute.manager [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Post operation of migration started
Dec 02 23:57:53 compute-1 nova_compute[187157]: 2025-12-02 23:57:53.604 187161 WARNING neutronclient.v2_0.client [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:53 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:53.679 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:53 compute-1 nova_compute[187157]: 2025-12-02 23:57:53.912 187161 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:57:53 compute-1 nova_compute[187157]: 2025-12-02 23:57:53.916 187161 DEBUG oslo_concurrency.lockutils [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:57:53 compute-1 nova_compute[187157]: 2025-12-02 23:57:53.916 187161 DEBUG nova.network.neutron [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Refreshing network info cache for port 933e46ed-57a7-472a-adf9-eff09ae7c559 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 02 23:57:54 compute-1 nova_compute[187157]: 2025-12-02 23:57:54.189 187161 WARNING neutronclient.v2_0.client [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:54 compute-1 nova_compute[187157]: 2025-12-02 23:57:54.189 187161 WARNING neutronclient.v2_0.client [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:54 compute-1 nova_compute[187157]: 2025-12-02 23:57:54.326 187161 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-d66a42a4-6bab-485d-a45f-0df43bf25d1b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:57:54 compute-1 nova_compute[187157]: 2025-12-02 23:57:54.326 187161 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-d66a42a4-6bab-485d-a45f-0df43bf25d1b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:57:54 compute-1 nova_compute[187157]: 2025-12-02 23:57:54.326 187161 DEBUG nova.network.neutron [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:57:54 compute-1 nova_compute[187157]: 2025-12-02 23:57:54.428 187161 WARNING neutronclient.v2_0.client [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:54 compute-1 nova_compute[187157]: 2025-12-02 23:57:54.441 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:54 compute-1 nova_compute[187157]: 2025-12-02 23:57:54.481 187161 DEBUG nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Starting finish_migration finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12604
Dec 02 23:57:54 compute-1 nova_compute[187157]: 2025-12-02 23:57:54.484 187161 DEBUG nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Instance directory exists: not creating _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5134
Dec 02 23:57:54 compute-1 nova_compute[187157]: 2025-12-02 23:57:54.485 187161 INFO nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Creating image(s)
Dec 02 23:57:54 compute-1 nova_compute[187157]: 2025-12-02 23:57:54.486 187161 DEBUG nova.objects.instance [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 35a3db0d-2b6a-47be-bc85-4b164026935c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:57:54 compute-1 nova_compute[187157]: 2025-12-02 23:57:54.833 187161 WARNING neutronclient.v2_0.client [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:54 compute-1 nova_compute[187157]: 2025-12-02 23:57:54.996 187161 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.102 187161 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.106s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.103 187161 DEBUG nova.virt.disk.api [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.104 187161 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.199 187161 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.200 187161 DEBUG nova.virt.disk.api [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.532 187161 WARNING neutronclient.v2_0.client [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.709 187161 DEBUG nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Did not create local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5272
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.710 187161 DEBUG nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Ensure instance console log exists: /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.711 187161 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.712 187161 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.712 187161 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.719 187161 DEBUG nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Start _get_guest_xml network_info=[{"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "vif_mac": "fa:16:3e:2d:bc:aa"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.727 187161 WARNING nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.730 187161 DEBUG nova.virt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-508367334', uuid='35a3db0d-2b6a-47be-bc85-4b164026935c'), owner=OwnerMeta(userid='d31b8a74cb3c48f3b147970eec936bca', username='tempest-TestExecuteActionsViaActuator-1889160444-project-admin', projectid='5f2368878ee9447ea8fcef9927711e2d', projectname='tempest-TestExecuteActionsViaActuator-1889160444'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'sata', 'hw_disk_bus': 'virtio', 'hw_input_bus': 'usb', 'hw_machine_type': 'q35', 'hw_pointer_model': 'usbtablet', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "vif_mac": "fa:16:3e:2d:bc:aa"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764719875.73019) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.736 187161 DEBUG nova.virt.libvirt.host [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.737 187161 DEBUG nova.virt.libvirt.host [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.740 187161 DEBUG nova.virt.libvirt.host [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.741 187161 DEBUG nova.virt.libvirt.host [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.743 187161 DEBUG nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.743 187161 DEBUG nova.virt.hardware [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.744 187161 DEBUG nova.virt.hardware [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.745 187161 DEBUG nova.virt.hardware [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.745 187161 DEBUG nova.virt.hardware [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.745 187161 DEBUG nova.virt.hardware [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.746 187161 DEBUG nova.virt.hardware [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.746 187161 DEBUG nova.virt.hardware [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.747 187161 DEBUG nova.virt.hardware [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.747 187161 DEBUG nova.virt.hardware [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.747 187161 DEBUG nova.virt.hardware [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.748 187161 DEBUG nova.virt.hardware [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 02 23:57:55 compute-1 nova_compute[187157]: 2025-12-02 23:57:55.748 187161 DEBUG nova.objects.instance [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 35a3db0d-2b6a-47be-bc85-4b164026935c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.164 187161 WARNING neutronclient.v2_0.client [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:56 compute-1 sshd-session[211253]: Invalid user sol from 193.32.162.146 port 35934
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.254 187161 DEBUG nova.objects.base [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<35a3db0d-2b6a-47be-bc85-4b164026935c> lazy-loaded attributes: trusted_certs,vcpu_model wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.259 187161 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.config --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.275 187161 DEBUG nova.network.neutron [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Updating instance_info_cache with network_info: [{"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:57:56 compute-1 sshd-session[211253]: Connection closed by invalid user sol 193.32.162.146 port 35934 [preauth]
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.320 187161 DEBUG nova.network.neutron [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Updated VIF entry in instance network info cache for port 933e46ed-57a7-472a-adf9-eff09ae7c559. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.321 187161 DEBUG nova.network.neutron [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Updating instance_info_cache with network_info: [{"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.350 187161 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.config --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.351 187161 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.351 187161 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.352 187161 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.354 187161 DEBUG nova.virt.libvirt.vif [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-02T23:56:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-508367334',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-508367334',id=8,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:56:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-q02f1mi0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_
model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:57:48Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=35a3db0d-2b6a-47be-bc85-4b164026935c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "vif_mac": "fa:16:3e:2d:bc:aa"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.354 187161 DEBUG nova.network.os_vif_util [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "vif_mac": "fa:16:3e:2d:bc:aa"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.355 187161 DEBUG nova.network.os_vif_util [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.358 187161 DEBUG nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] End _get_guest_xml xml=<domain type="kvm">
Dec 02 23:57:56 compute-1 nova_compute[187157]:   <uuid>35a3db0d-2b6a-47be-bc85-4b164026935c</uuid>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   <name>instance-00000008</name>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   <memory>131072</memory>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   <vcpu>1</vcpu>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   <metadata>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-508367334</nova:name>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-02 23:57:55</nova:creationTime>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 02 23:57:56 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 02 23:57:56 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 02 23:57:56 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 02 23:57:56 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 23:57:56 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 02 23:57:56 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 02 23:57:56 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 02 23:57:56 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 02 23:57:56 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 02 23:57:56 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 02 23:57:56 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 02 23:57:56 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 02 23:57:56 compute-1 nova_compute[187157]:         <nova:properties>
Dec 02 23:57:56 compute-1 nova_compute[187157]:           <nova:property name="hw_cdrom_bus">sata</nova:property>
Dec 02 23:57:56 compute-1 nova_compute[187157]:           <nova:property name="hw_disk_bus">virtio</nova:property>
Dec 02 23:57:56 compute-1 nova_compute[187157]:           <nova:property name="hw_input_bus">usb</nova:property>
Dec 02 23:57:56 compute-1 nova_compute[187157]:           <nova:property name="hw_machine_type">q35</nova:property>
Dec 02 23:57:56 compute-1 nova_compute[187157]:           <nova:property name="hw_pointer_model">usbtablet</nova:property>
Dec 02 23:57:56 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 02 23:57:56 compute-1 nova_compute[187157]:           <nova:property name="hw_video_model">virtio</nova:property>
Dec 02 23:57:56 compute-1 nova_compute[187157]:           <nova:property name="hw_vif_model">virtio</nova:property>
Dec 02 23:57:56 compute-1 nova_compute[187157]:         </nova:properties>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       </nova:image>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <nova:owner>
Dec 02 23:57:56 compute-1 nova_compute[187157]:         <nova:user uuid="d31b8a74cb3c48f3b147970eec936bca">tempest-TestExecuteActionsViaActuator-1889160444-project-admin</nova:user>
Dec 02 23:57:56 compute-1 nova_compute[187157]:         <nova:project uuid="5f2368878ee9447ea8fcef9927711e2d">tempest-TestExecuteActionsViaActuator-1889160444</nova:project>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       </nova:owner>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <nova:ports>
Dec 02 23:57:56 compute-1 nova_compute[187157]:         <nova:port uuid="933e46ed-57a7-472a-adf9-eff09ae7c559">
Dec 02 23:57:56 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:         </nova:port>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       </nova:ports>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     </nova:instance>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   </metadata>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <system>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <entry name="serial">35a3db0d-2b6a-47be-bc85-4b164026935c</entry>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <entry name="uuid">35a3db0d-2b6a-47be-bc85-4b164026935c</entry>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     </system>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   </sysinfo>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   <os>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   </os>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   <features>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <acpi/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <apic/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <vmcoreinfo/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   </features>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   </clock>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact">
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <model>Nehalem</model>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <topology sockets="1" cores="1" threads="1"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   </cpu>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   <devices>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     </disk>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.config"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     </disk>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:2d:bc:aa"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <target dev="tap933e46ed-57"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     </interface>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/console.log" append="off"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     </serial>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <video>
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     </video>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     </rng>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <controller type="usb" index="0"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 02 23:57:56 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 02 23:57:56 compute-1 nova_compute[187157]:     </memballoon>
Dec 02 23:57:56 compute-1 nova_compute[187157]:   </devices>
Dec 02 23:57:56 compute-1 nova_compute[187157]: </domain>
Dec 02 23:57:56 compute-1 nova_compute[187157]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.360 187161 DEBUG nova.virt.libvirt.vif [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-02T23:56:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-508367334',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-508367334',id=8,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:56:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-q02f1mi0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:57:48Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=35a3db0d-2b6a-47be-bc85-4b164026935c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "vif_mac": "fa:16:3e:2d:bc:aa"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.361 187161 DEBUG nova.network.os_vif_util [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "vif_mac": "fa:16:3e:2d:bc:aa"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.362 187161 DEBUG nova.network.os_vif_util [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.362 187161 DEBUG os_vif [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.364 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.364 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.365 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.366 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.366 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c58cc7f7-bcdb-594b-9781-ef98323345a6', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.368 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.371 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.374 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.374 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap933e46ed-57, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.374 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap933e46ed-57, col_values=(('qos', UUID('91b69662-2e86-4af1-b784-ddda5a868f98')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.374 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap933e46ed-57, col_values=(('external_ids', {'iface-id': '933e46ed-57a7-472a-adf9-eff09ae7c559', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:bc:aa', 'vm-uuid': '35a3db0d-2b6a-47be-bc85-4b164026935c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.376 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:56 compute-1 NetworkManager[55553]: <info>  [1764719876.3770] manager: (tap933e46ed-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.378 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.384 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.385 187161 INFO os_vif [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57')
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.790 187161 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-d66a42a4-6bab-485d-a45f-0df43bf25d1b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:57:56 compute-1 nova_compute[187157]: 2025-12-02 23:57:56.828 187161 DEBUG oslo_concurrency.lockutils [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:57:57 compute-1 nova_compute[187157]: 2025-12-02 23:57:57.030 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:57 compute-1 nova_compute[187157]: 2025-12-02 23:57:57.317 187161 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:57 compute-1 nova_compute[187157]: 2025-12-02 23:57:57.318 187161 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:57 compute-1 nova_compute[187157]: 2025-12-02 23:57:57.319 187161 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:57 compute-1 nova_compute[187157]: 2025-12-02 23:57:57.324 187161 INFO nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 02 23:57:57 compute-1 virtqemud[186882]: Domain id=6 name='instance-00000006' uuid=d66a42a4-6bab-485d-a45f-0df43bf25d1b is tainted: custom-monitor
Dec 02 23:57:57 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Dec 02 23:57:57 compute-1 systemd[211081]: Activating special unit Exit the Session...
Dec 02 23:57:57 compute-1 systemd[211081]: Stopped target Main User Target.
Dec 02 23:57:57 compute-1 systemd[211081]: Stopped target Basic System.
Dec 02 23:57:57 compute-1 systemd[211081]: Stopped target Paths.
Dec 02 23:57:57 compute-1 systemd[211081]: Stopped target Sockets.
Dec 02 23:57:57 compute-1 systemd[211081]: Stopped target Timers.
Dec 02 23:57:57 compute-1 systemd[211081]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 02 23:57:57 compute-1 systemd[211081]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 02 23:57:57 compute-1 systemd[211081]: Closed D-Bus User Message Bus Socket.
Dec 02 23:57:57 compute-1 systemd[211081]: Stopped Create User's Volatile Files and Directories.
Dec 02 23:57:57 compute-1 systemd[211081]: Removed slice User Application Slice.
Dec 02 23:57:57 compute-1 systemd[211081]: Reached target Shutdown.
Dec 02 23:57:57 compute-1 systemd[211081]: Finished Exit the Session.
Dec 02 23:57:57 compute-1 systemd[211081]: Reached target Exit the Session.
Dec 02 23:57:57 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Dec 02 23:57:57 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Dec 02 23:57:57 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Dec 02 23:57:57 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Dec 02 23:57:57 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Dec 02 23:57:57 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Dec 02 23:57:57 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Dec 02 23:57:57 compute-1 podman[211261]: 2025-12-02 23:57:57.768974521 +0000 UTC m=+0.091064658 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:57:57 compute-1 nova_compute[187157]: 2025-12-02 23:57:57.942 187161 DEBUG nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:57:57 compute-1 nova_compute[187157]: 2025-12-02 23:57:57.943 187161 DEBUG nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:57:57 compute-1 nova_compute[187157]: 2025-12-02 23:57:57.943 187161 DEBUG nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No VIF found with MAC fa:16:3e:2d:bc:aa, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 02 23:57:57 compute-1 nova_compute[187157]: 2025-12-02 23:57:57.945 187161 INFO nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Using config drive
Dec 02 23:57:58 compute-1 kernel: tap933e46ed-57: entered promiscuous mode
Dec 02 23:57:58 compute-1 NetworkManager[55553]: <info>  [1764719878.0373] manager: (tap933e46ed-57): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Dec 02 23:57:58 compute-1 ovn_controller[95464]: 2025-12-02T23:57:58Z|00072|binding|INFO|Claiming lport 933e46ed-57a7-472a-adf9-eff09ae7c559 for this chassis.
Dec 02 23:57:58 compute-1 ovn_controller[95464]: 2025-12-02T23:57:58Z|00073|binding|INFO|933e46ed-57a7-472a-adf9-eff09ae7c559: Claiming fa:16:3e:2d:bc:aa 10.100.0.9
Dec 02 23:57:58 compute-1 nova_compute[187157]: 2025-12-02 23:57:58.074 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:58 compute-1 nova_compute[187157]: 2025-12-02 23:57:58.077 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:58.083 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:bc:aa 10.100.0.9'], port_security=['fa:16:3e:2d:bc:aa 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '35a3db0d-2b6a-47be-bc85-4b164026935c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=933e46ed-57a7-472a-adf9-eff09ae7c559) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:57:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:58.084 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 933e46ed-57a7-472a-adf9-eff09ae7c559 in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a bound to our chassis
Dec 02 23:57:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:58.087 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:57:58 compute-1 systemd-udevd[211299]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:57:58 compute-1 ovn_controller[95464]: 2025-12-02T23:57:58Z|00074|binding|INFO|Setting lport 933e46ed-57a7-472a-adf9-eff09ae7c559 ovn-installed in OVS
Dec 02 23:57:58 compute-1 ovn_controller[95464]: 2025-12-02T23:57:58Z|00075|binding|INFO|Setting lport 933e46ed-57a7-472a-adf9-eff09ae7c559 up in Southbound
Dec 02 23:57:58 compute-1 nova_compute[187157]: 2025-12-02 23:57:58.105 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:58.115 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[602795ff-68f3-4051-b494-6ab1cf69aac3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:58 compute-1 NetworkManager[55553]: <info>  [1764719878.1233] device (tap933e46ed-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:57:58 compute-1 NetworkManager[55553]: <info>  [1764719878.1258] device (tap933e46ed-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 02 23:57:58 compute-1 systemd-machined[153454]: New machine qemu-7-instance-00000008.
Dec 02 23:57:58 compute-1 systemd[1]: Started Virtual Machine qemu-7-instance-00000008.
Dec 02 23:57:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:58.168 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[ab7c02d8-72be-4be7-ac74-66f32a0e8e3f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:58.174 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[84f995e7-9397-4f69-81a9-237c27434a9c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:58.225 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[3795a4b7-149f-4606-a454-25041f3e4edb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:58.262 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ebfa2ad3-e503-4cc9-bd4c-c58ffc6e912e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 24, 'tx_packets': 13, 'rx_bytes': 1288, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 24, 'tx_packets': 13, 'rx_bytes': 1288, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371237, 'reachable_time': 16394, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211314, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:58.295 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ca006610-a0fd-46ae-9be0-1b826cbd4549]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371249, 'tstamp': 371249}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211316, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371252, 'tstamp': 371252}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211316, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:58.297 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:58.301 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec494140-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:58.301 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:57:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:58.301 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec494140-a0, col_values=(('external_ids', {'iface-id': '9ee451cb-cc6e-44d6-98fb-cdfa0566e521'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:58.302 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:57:58 compute-1 nova_compute[187157]: 2025-12-02 23:57:58.302 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:57:58.304 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[bc8832a8-4375-4db8-86d9-4921dd1f420a]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ec494140-a5f4-4327-8807-d7248b1cdc9a\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ec494140-a5f4-4327-8807-d7248b1cdc9a\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:58 compute-1 nova_compute[187157]: 2025-12-02 23:57:58.335 187161 INFO nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 02 23:57:58 compute-1 nova_compute[187157]: 2025-12-02 23:57:58.361 187161 DEBUG nova.compute.manager [req-48c81127-40d9-4909-a8c7-901eff78e27d req-d95c2221-f216-4bcb-9711-7ce7c9d0f9d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:58 compute-1 nova_compute[187157]: 2025-12-02 23:57:58.361 187161 DEBUG oslo_concurrency.lockutils [req-48c81127-40d9-4909-a8c7-901eff78e27d req-d95c2221-f216-4bcb-9711-7ce7c9d0f9d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:58 compute-1 nova_compute[187157]: 2025-12-02 23:57:58.361 187161 DEBUG oslo_concurrency.lockutils [req-48c81127-40d9-4909-a8c7-901eff78e27d req-d95c2221-f216-4bcb-9711-7ce7c9d0f9d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:58 compute-1 nova_compute[187157]: 2025-12-02 23:57:58.362 187161 DEBUG oslo_concurrency.lockutils [req-48c81127-40d9-4909-a8c7-901eff78e27d req-d95c2221-f216-4bcb-9711-7ce7c9d0f9d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:58 compute-1 nova_compute[187157]: 2025-12-02 23:57:58.362 187161 DEBUG nova.compute.manager [req-48c81127-40d9-4909-a8c7-901eff78e27d req-d95c2221-f216-4bcb-9711-7ce7c9d0f9d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] No waiting events found dispatching network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:57:58 compute-1 nova_compute[187157]: 2025-12-02 23:57:58.362 187161 WARNING nova.compute.manager [req-48c81127-40d9-4909-a8c7-901eff78e27d req-d95c2221-f216-4bcb-9711-7ce7c9d0f9d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received unexpected event network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 for instance with vm_state active and task_state resize_finish.
Dec 02 23:57:59 compute-1 nova_compute[187157]: 2025-12-02 23:57:59.203 187161 DEBUG nova.compute.manager [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 02 23:57:59 compute-1 nova_compute[187157]: 2025-12-02 23:57:59.209 187161 INFO nova.virt.libvirt.driver [-] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Instance running successfully.
Dec 02 23:57:59 compute-1 virtqemud[186882]: argument unsupported: QEMU guest agent is not configured
Dec 02 23:57:59 compute-1 nova_compute[187157]: 2025-12-02 23:57:59.212 187161 DEBUG nova.virt.libvirt.guest [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:200
Dec 02 23:57:59 compute-1 nova_compute[187157]: 2025-12-02 23:57:59.212 187161 DEBUG nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] finish_migration finished successfully. finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12699
Dec 02 23:57:59 compute-1 nova_compute[187157]: 2025-12-02 23:57:59.344 187161 INFO nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 02 23:57:59 compute-1 nova_compute[187157]: 2025-12-02 23:57:59.350 187161 DEBUG nova.compute.manager [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 02 23:57:59 compute-1 nova_compute[187157]: 2025-12-02 23:57:59.861 187161 DEBUG nova.objects.instance [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 02 23:58:00 compute-1 nova_compute[187157]: 2025-12-02 23:58:00.432 187161 DEBUG nova.compute.manager [req-822ccfe8-fe6f-48f1-9c36-6ee918b0a604 req-e25ba2fa-d139-4c94-af12-4e84e5adcc2b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:00 compute-1 nova_compute[187157]: 2025-12-02 23:58:00.434 187161 DEBUG oslo_concurrency.lockutils [req-822ccfe8-fe6f-48f1-9c36-6ee918b0a604 req-e25ba2fa-d139-4c94-af12-4e84e5adcc2b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:00 compute-1 nova_compute[187157]: 2025-12-02 23:58:00.435 187161 DEBUG oslo_concurrency.lockutils [req-822ccfe8-fe6f-48f1-9c36-6ee918b0a604 req-e25ba2fa-d139-4c94-af12-4e84e5adcc2b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:00 compute-1 nova_compute[187157]: 2025-12-02 23:58:00.436 187161 DEBUG oslo_concurrency.lockutils [req-822ccfe8-fe6f-48f1-9c36-6ee918b0a604 req-e25ba2fa-d139-4c94-af12-4e84e5adcc2b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:00 compute-1 nova_compute[187157]: 2025-12-02 23:58:00.436 187161 DEBUG nova.compute.manager [req-822ccfe8-fe6f-48f1-9c36-6ee918b0a604 req-e25ba2fa-d139-4c94-af12-4e84e5adcc2b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] No waiting events found dispatching network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:58:00 compute-1 nova_compute[187157]: 2025-12-02 23:58:00.437 187161 WARNING nova.compute.manager [req-822ccfe8-fe6f-48f1-9c36-6ee918b0a604 req-e25ba2fa-d139-4c94-af12-4e84e5adcc2b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received unexpected event network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 for instance with vm_state resized and task_state None.
Dec 02 23:58:00 compute-1 nova_compute[187157]: 2025-12-02 23:58:00.885 187161 WARNING neutronclient.v2_0.client [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:01 compute-1 nova_compute[187157]: 2025-12-02 23:58:01.182 187161 WARNING neutronclient.v2_0.client [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:01 compute-1 nova_compute[187157]: 2025-12-02 23:58:01.185 187161 WARNING neutronclient.v2_0.client [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:01 compute-1 nova_compute[187157]: 2025-12-02 23:58:01.378 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:01.706 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:01.706 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:01.707 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:02 compute-1 nova_compute[187157]: 2025-12-02 23:58:02.034 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:02 compute-1 podman[211329]: 2025-12-02 23:58:02.321164529 +0000 UTC m=+0.127071779 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 23:58:04 compute-1 podman[211355]: 2025-12-02 23:58:04.259760976 +0000 UTC m=+0.094200402 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 02 23:58:05 compute-1 podman[197537]: time="2025-12-02T23:58:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:58:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:58:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 02 23:58:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:58:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3063 "" "Go-http-client/1.1"
Dec 02 23:58:06 compute-1 nova_compute[187157]: 2025-12-02 23:58:06.381 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:07 compute-1 nova_compute[187157]: 2025-12-02 23:58:07.036 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:11 compute-1 nova_compute[187157]: 2025-12-02 23:58:11.383 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:12 compute-1 nova_compute[187157]: 2025-12-02 23:58:12.069 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:12 compute-1 ovn_controller[95464]: 2025-12-02T23:58:12Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2d:bc:aa 10.100.0.9
Dec 02 23:58:15 compute-1 podman[211383]: 2025-12-02 23:58:15.202074728 +0000 UTC m=+0.048312246 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-type=git, com.redhat.component=ubi9-minimal-container, release=1755695350)
Dec 02 23:58:16 compute-1 nova_compute[187157]: 2025-12-02 23:58:16.387 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:17 compute-1 nova_compute[187157]: 2025-12-02 23:58:17.072 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:17 compute-1 nova_compute[187157]: 2025-12-02 23:58:17.875 187161 DEBUG oslo_concurrency.lockutils [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:17 compute-1 nova_compute[187157]: 2025-12-02 23:58:17.876 187161 DEBUG oslo_concurrency.lockutils [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:17 compute-1 nova_compute[187157]: 2025-12-02 23:58:17.876 187161 DEBUG oslo_concurrency.lockutils [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:17 compute-1 nova_compute[187157]: 2025-12-02 23:58:17.877 187161 DEBUG oslo_concurrency.lockutils [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:17 compute-1 nova_compute[187157]: 2025-12-02 23:58:17.877 187161 DEBUG oslo_concurrency.lockutils [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:17 compute-1 nova_compute[187157]: 2025-12-02 23:58:17.895 187161 INFO nova.compute.manager [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Terminating instance
Dec 02 23:58:18 compute-1 nova_compute[187157]: 2025-12-02 23:58:18.414 187161 DEBUG nova.compute.manager [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 02 23:58:18 compute-1 kernel: tap3ff98f13-ac (unregistering): left promiscuous mode
Dec 02 23:58:18 compute-1 NetworkManager[55553]: <info>  [1764719898.4497] device (tap3ff98f13-ac): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 02 23:58:18 compute-1 nova_compute[187157]: 2025-12-02 23:58:18.456 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:18 compute-1 ovn_controller[95464]: 2025-12-02T23:58:18Z|00076|binding|INFO|Releasing lport 3ff98f13-ac75-44a9-b36f-3c729c73fc57 from this chassis (sb_readonly=0)
Dec 02 23:58:18 compute-1 ovn_controller[95464]: 2025-12-02T23:58:18Z|00077|binding|INFO|Setting lport 3ff98f13-ac75-44a9-b36f-3c729c73fc57 down in Southbound
Dec 02 23:58:18 compute-1 ovn_controller[95464]: 2025-12-02T23:58:18Z|00078|binding|INFO|Removing iface tap3ff98f13-ac ovn-installed in OVS
Dec 02 23:58:18 compute-1 nova_compute[187157]: 2025-12-02 23:58:18.460 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:18 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:18.468 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:18:82 10.100.0.3'], port_security=['fa:16:3e:e0:18:82 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'e5b8a0f2-4b3a-4069-a535-5179df8ffa6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=3ff98f13-ac75-44a9-b36f-3c729c73fc57) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:58:18 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:18.470 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 3ff98f13-ac75-44a9-b36f-3c729c73fc57 in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a unbound from our chassis
Dec 02 23:58:18 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:18.472 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:58:18 compute-1 nova_compute[187157]: 2025-12-02 23:58:18.492 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:18 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:18.492 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b95b4114-fa47-461b-92e8-cbe087df509b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:18 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Deactivated successfully.
Dec 02 23:58:18 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Consumed 15.513s CPU time.
Dec 02 23:58:18 compute-1 systemd-machined[153454]: Machine qemu-5-instance-00000009 terminated.
Dec 02 23:58:18 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:18.545 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[682712f1-5749-421f-b429-8fe4b8cefe53]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:18 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:18.548 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[91e08128-54c6-4d56-9508-8ac076730c36]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:18 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:18.592 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[cdfff6a2-2663-482f-8022-fbe79ad4c13d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:18 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:18.618 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[4a477be1-601a-4eb6-b24e-8b33cb77660e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 35, 'tx_packets': 15, 'rx_bytes': 1750, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 35, 'tx_packets': 15, 'rx_bytes': 1750, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371237, 'reachable_time': 16394, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211418, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:18 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:18.644 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[82cd8140-fed1-4c18-ac09-c3f88df35ad8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371249, 'tstamp': 371249}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211420, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371252, 'tstamp': 371252}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211420, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:18 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:18.646 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:18 compute-1 nova_compute[187157]: 2025-12-02 23:58:18.648 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:18 compute-1 nova_compute[187157]: 2025-12-02 23:58:18.665 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:18 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:18.665 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec494140-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:18 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:18.666 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:58:18 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:18.666 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec494140-a0, col_values=(('external_ids', {'iface-id': '9ee451cb-cc6e-44d6-98fb-cdfa0566e521'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:18 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:18.667 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:58:18 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:18.668 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[8310ddf1-cd40-4aa9-8c43-4bafd762d21c]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ec494140-a5f4-4327-8807-d7248b1cdc9a\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ec494140-a5f4-4327-8807-d7248b1cdc9a\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:18 compute-1 nova_compute[187157]: 2025-12-02 23:58:18.680 187161 DEBUG nova.compute.manager [req-18d6acb8-b7ab-4a94-986c-e35380971890 req-d681369c-72e6-4526-bc94-1f1bcbd6a063 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Received event network-vif-unplugged-3ff98f13-ac75-44a9-b36f-3c729c73fc57 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:18 compute-1 nova_compute[187157]: 2025-12-02 23:58:18.680 187161 DEBUG oslo_concurrency.lockutils [req-18d6acb8-b7ab-4a94-986c-e35380971890 req-d681369c-72e6-4526-bc94-1f1bcbd6a063 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:18 compute-1 nova_compute[187157]: 2025-12-02 23:58:18.681 187161 DEBUG oslo_concurrency.lockutils [req-18d6acb8-b7ab-4a94-986c-e35380971890 req-d681369c-72e6-4526-bc94-1f1bcbd6a063 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:18 compute-1 nova_compute[187157]: 2025-12-02 23:58:18.681 187161 DEBUG oslo_concurrency.lockutils [req-18d6acb8-b7ab-4a94-986c-e35380971890 req-d681369c-72e6-4526-bc94-1f1bcbd6a063 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:18 compute-1 nova_compute[187157]: 2025-12-02 23:58:18.681 187161 DEBUG nova.compute.manager [req-18d6acb8-b7ab-4a94-986c-e35380971890 req-d681369c-72e6-4526-bc94-1f1bcbd6a063 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] No waiting events found dispatching network-vif-unplugged-3ff98f13-ac75-44a9-b36f-3c729c73fc57 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:58:18 compute-1 nova_compute[187157]: 2025-12-02 23:58:18.682 187161 DEBUG nova.compute.manager [req-18d6acb8-b7ab-4a94-986c-e35380971890 req-d681369c-72e6-4526-bc94-1f1bcbd6a063 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Received event network-vif-unplugged-3ff98f13-ac75-44a9-b36f-3c729c73fc57 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:58:18 compute-1 nova_compute[187157]: 2025-12-02 23:58:18.706 187161 INFO nova.virt.libvirt.driver [-] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Instance destroyed successfully.
Dec 02 23:58:18 compute-1 nova_compute[187157]: 2025-12-02 23:58:18.707 187161 DEBUG nova.objects.instance [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lazy-loading 'resources' on Instance uuid e5b8a0f2-4b3a-4069-a535-5179df8ffa6a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.215 187161 DEBUG nova.virt.libvirt.vif [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-02T23:56:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-415186711',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-415186711',id=9,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:57:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-eb36m3sp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T23:57:18Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=e5b8a0f2-4b3a-4069-a535-5179df8ffa6a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "address": "fa:16:3e:e0:18:82", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ff98f13-ac", "ovs_interfaceid": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.216 187161 DEBUG nova.network.os_vif_util [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "address": "fa:16:3e:e0:18:82", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ff98f13-ac", "ovs_interfaceid": "3ff98f13-ac75-44a9-b36f-3c729c73fc57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.219 187161 DEBUG nova.network.os_vif_util [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:18:82,bridge_name='br-int',has_traffic_filtering=True,id=3ff98f13-ac75-44a9-b36f-3c729c73fc57,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ff98f13-ac') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.220 187161 DEBUG os_vif [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:18:82,bridge_name='br-int',has_traffic_filtering=True,id=3ff98f13-ac75-44a9-b36f-3c729c73fc57,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ff98f13-ac') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.224 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.225 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ff98f13-ac, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.228 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.231 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.232 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.234 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.234 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=1d00ae94-1d76-4674-a0c9-0bde7c692c22) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.236 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.237 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.240 187161 INFO os_vif [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:18:82,bridge_name='br-int',has_traffic_filtering=True,id=3ff98f13-ac75-44a9-b36f-3c729c73fc57,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ff98f13-ac')
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.241 187161 INFO nova.virt.libvirt.driver [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Deleting instance files /var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a_del
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.243 187161 INFO nova.virt.libvirt.driver [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Deletion of /var/lib/nova/instances/e5b8a0f2-4b3a-4069-a535-5179df8ffa6a_del complete
Dec 02 23:58:19 compute-1 openstack_network_exporter[199685]: ERROR   23:58:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:58:19 compute-1 openstack_network_exporter[199685]: ERROR   23:58:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:58:19 compute-1 openstack_network_exporter[199685]: ERROR   23:58:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:58:19 compute-1 openstack_network_exporter[199685]: ERROR   23:58:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:58:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:58:19 compute-1 openstack_network_exporter[199685]: ERROR   23:58:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:58:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.757 187161 INFO nova.compute.manager [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Took 1.34 seconds to destroy the instance on the hypervisor.
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.757 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.758 187161 DEBUG nova.compute.manager [-] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.758 187161 DEBUG nova.network.neutron [-] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 02 23:58:19 compute-1 nova_compute[187157]: 2025-12-02 23:58:19.758 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:20 compute-1 nova_compute[187157]: 2025-12-02 23:58:20.192 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:20 compute-1 podman[211438]: 2025-12-02 23:58:20.281428048 +0000 UTC m=+0.102676125 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4)
Dec 02 23:58:20 compute-1 nova_compute[187157]: 2025-12-02 23:58:20.767 187161 DEBUG nova.compute.manager [req-13ad19a1-b824-423f-a9b0-4a7587e883c2 req-a4cfb468-b267-4b70-b0a0-76dbf9e2cf53 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Received event network-vif-unplugged-3ff98f13-ac75-44a9-b36f-3c729c73fc57 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:20 compute-1 nova_compute[187157]: 2025-12-02 23:58:20.767 187161 DEBUG oslo_concurrency.lockutils [req-13ad19a1-b824-423f-a9b0-4a7587e883c2 req-a4cfb468-b267-4b70-b0a0-76dbf9e2cf53 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:20 compute-1 nova_compute[187157]: 2025-12-02 23:58:20.768 187161 DEBUG oslo_concurrency.lockutils [req-13ad19a1-b824-423f-a9b0-4a7587e883c2 req-a4cfb468-b267-4b70-b0a0-76dbf9e2cf53 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:20 compute-1 nova_compute[187157]: 2025-12-02 23:58:20.768 187161 DEBUG oslo_concurrency.lockutils [req-13ad19a1-b824-423f-a9b0-4a7587e883c2 req-a4cfb468-b267-4b70-b0a0-76dbf9e2cf53 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:20 compute-1 nova_compute[187157]: 2025-12-02 23:58:20.769 187161 DEBUG nova.compute.manager [req-13ad19a1-b824-423f-a9b0-4a7587e883c2 req-a4cfb468-b267-4b70-b0a0-76dbf9e2cf53 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] No waiting events found dispatching network-vif-unplugged-3ff98f13-ac75-44a9-b36f-3c729c73fc57 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:58:20 compute-1 nova_compute[187157]: 2025-12-02 23:58:20.769 187161 DEBUG nova.compute.manager [req-13ad19a1-b824-423f-a9b0-4a7587e883c2 req-a4cfb468-b267-4b70-b0a0-76dbf9e2cf53 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Received event network-vif-unplugged-3ff98f13-ac75-44a9-b36f-3c729c73fc57 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:58:20 compute-1 nova_compute[187157]: 2025-12-02 23:58:20.770 187161 DEBUG nova.compute.manager [req-13ad19a1-b824-423f-a9b0-4a7587e883c2 req-a4cfb468-b267-4b70-b0a0-76dbf9e2cf53 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Received event network-vif-deleted-3ff98f13-ac75-44a9-b36f-3c729c73fc57 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:20 compute-1 nova_compute[187157]: 2025-12-02 23:58:20.770 187161 INFO nova.compute.manager [req-13ad19a1-b824-423f-a9b0-4a7587e883c2 req-a4cfb468-b267-4b70-b0a0-76dbf9e2cf53 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Neutron deleted interface 3ff98f13-ac75-44a9-b36f-3c729c73fc57; detaching it from the instance and deleting it from the info cache
Dec 02 23:58:20 compute-1 nova_compute[187157]: 2025-12-02 23:58:20.770 187161 DEBUG nova.network.neutron [req-13ad19a1-b824-423f-a9b0-4a7587e883c2 req-a4cfb468-b267-4b70-b0a0-76dbf9e2cf53 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:58:20 compute-1 nova_compute[187157]: 2025-12-02 23:58:20.998 187161 DEBUG nova.network.neutron [-] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:58:21 compute-1 nova_compute[187157]: 2025-12-02 23:58:21.282 187161 DEBUG nova.compute.manager [req-13ad19a1-b824-423f-a9b0-4a7587e883c2 req-a4cfb468-b267-4b70-b0a0-76dbf9e2cf53 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Detach interface failed, port_id=3ff98f13-ac75-44a9-b36f-3c729c73fc57, reason: Instance e5b8a0f2-4b3a-4069-a535-5179df8ffa6a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 02 23:58:21 compute-1 nova_compute[187157]: 2025-12-02 23:58:21.505 187161 INFO nova.compute.manager [-] [instance: e5b8a0f2-4b3a-4069-a535-5179df8ffa6a] Took 1.75 seconds to deallocate network for instance.
Dec 02 23:58:22 compute-1 nova_compute[187157]: 2025-12-02 23:58:22.027 187161 DEBUG oslo_concurrency.lockutils [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:22 compute-1 nova_compute[187157]: 2025-12-02 23:58:22.028 187161 DEBUG oslo_concurrency.lockutils [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:22 compute-1 nova_compute[187157]: 2025-12-02 23:58:22.077 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:22 compute-1 nova_compute[187157]: 2025-12-02 23:58:22.084 187161 DEBUG nova.scheduler.client.report [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Refreshing inventories for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 02 23:58:22 compute-1 nova_compute[187157]: 2025-12-02 23:58:22.107 187161 DEBUG nova.scheduler.client.report [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Updating ProviderTree inventory for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 02 23:58:22 compute-1 nova_compute[187157]: 2025-12-02 23:58:22.108 187161 DEBUG nova.compute.provider_tree [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Updating inventory in ProviderTree for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 02 23:58:22 compute-1 nova_compute[187157]: 2025-12-02 23:58:22.127 187161 DEBUG nova.scheduler.client.report [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Refreshing aggregate associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 02 23:58:22 compute-1 nova_compute[187157]: 2025-12-02 23:58:22.157 187161 DEBUG nova.scheduler.client.report [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Refreshing trait associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ARCH_X86_64,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE
 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 02 23:58:22 compute-1 nova_compute[187157]: 2025-12-02 23:58:22.322 187161 DEBUG nova.compute.provider_tree [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:58:22 compute-1 nova_compute[187157]: 2025-12-02 23:58:22.837 187161 DEBUG nova.scheduler.client.report [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:58:23 compute-1 nova_compute[187157]: 2025-12-02 23:58:23.351 187161 DEBUG oslo_concurrency.lockutils [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.323s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:23 compute-1 nova_compute[187157]: 2025-12-02 23:58:23.381 187161 INFO nova.scheduler.client.report [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Deleted allocations for instance e5b8a0f2-4b3a-4069-a535-5179df8ffa6a
Dec 02 23:58:24 compute-1 nova_compute[187157]: 2025-12-02 23:58:24.237 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:24 compute-1 nova_compute[187157]: 2025-12-02 23:58:24.420 187161 DEBUG oslo_concurrency.lockutils [None req-4975f27b-3f34-4aee-8a1f-df69ac20654c d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "e5b8a0f2-4b3a-4069-a535-5179df8ffa6a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.545s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:25 compute-1 nova_compute[187157]: 2025-12-02 23:58:25.244 187161 DEBUG oslo_concurrency.lockutils [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:25 compute-1 nova_compute[187157]: 2025-12-02 23:58:25.245 187161 DEBUG oslo_concurrency.lockutils [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:25 compute-1 nova_compute[187157]: 2025-12-02 23:58:25.246 187161 DEBUG oslo_concurrency.lockutils [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:25 compute-1 nova_compute[187157]: 2025-12-02 23:58:25.246 187161 DEBUG oslo_concurrency.lockutils [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:25 compute-1 nova_compute[187157]: 2025-12-02 23:58:25.246 187161 DEBUG oslo_concurrency.lockutils [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:25 compute-1 nova_compute[187157]: 2025-12-02 23:58:25.264 187161 INFO nova.compute.manager [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Terminating instance
Dec 02 23:58:25 compute-1 nova_compute[187157]: 2025-12-02 23:58:25.789 187161 DEBUG nova.compute.manager [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 02 23:58:25 compute-1 kernel: tap933e46ed-57 (unregistering): left promiscuous mode
Dec 02 23:58:25 compute-1 NetworkManager[55553]: <info>  [1764719905.8214] device (tap933e46ed-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 02 23:58:25 compute-1 ovn_controller[95464]: 2025-12-02T23:58:25Z|00079|binding|INFO|Releasing lport 933e46ed-57a7-472a-adf9-eff09ae7c559 from this chassis (sb_readonly=0)
Dec 02 23:58:25 compute-1 nova_compute[187157]: 2025-12-02 23:58:25.831 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:25 compute-1 ovn_controller[95464]: 2025-12-02T23:58:25Z|00080|binding|INFO|Setting lport 933e46ed-57a7-472a-adf9-eff09ae7c559 down in Southbound
Dec 02 23:58:25 compute-1 ovn_controller[95464]: 2025-12-02T23:58:25Z|00081|binding|INFO|Removing iface tap933e46ed-57 ovn-installed in OVS
Dec 02 23:58:25 compute-1 nova_compute[187157]: 2025-12-02 23:58:25.834 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:25 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:25.847 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:bc:aa 10.100.0.9'], port_security=['fa:16:3e:2d:bc:aa 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '35a3db0d-2b6a-47be-bc85-4b164026935c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=933e46ed-57a7-472a-adf9-eff09ae7c559) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:58:25 compute-1 nova_compute[187157]: 2025-12-02 23:58:25.847 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:25 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:25.849 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 933e46ed-57a7-472a-adf9-eff09ae7c559 in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a unbound from our chassis
Dec 02 23:58:25 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:25.855 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:58:25 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:25.890 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[282ea307-498a-4063-aff2-ea81fbc5504f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:25 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000008.scope: Deactivated successfully.
Dec 02 23:58:25 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000008.scope: Consumed 13.417s CPU time.
Dec 02 23:58:25 compute-1 systemd-machined[153454]: Machine qemu-7-instance-00000008 terminated.
Dec 02 23:58:25 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:25.938 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[8f77b82e-6f9a-4bbe-a04d-e3434970c7fa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:25 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:25.942 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[8d060a8b-6c40-40d7-a752-7c9967a9e4b8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:25 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:25.996 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[a9a48e7f-1e99-4f7d-b666-f72bff6ecb63]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:26.023 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[47a04be0-2d13-4408-93eb-c299df307c5e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 17, 'rx_bytes': 1792, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 17, 'rx_bytes': 1792, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371237, 'reachable_time': 25608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211474, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:26.043 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[2d4e48c8-0562-4ddc-965f-3e26b1addeef]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371249, 'tstamp': 371249}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211486, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371252, 'tstamp': 371252}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211486, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:26.045 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.048 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:26.053 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec494140-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.054 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:26.054 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:58:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:26.055 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec494140-a0, col_values=(('external_ids', {'iface-id': '9ee451cb-cc6e-44d6-98fb-cdfa0566e521'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:26.055 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:58:26 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:26.057 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ba451870-8d07-430a-9c18-c4637aaf2582]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ec494140-a5f4-4327-8807-d7248b1cdc9a\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ec494140-a5f4-4327-8807-d7248b1cdc9a\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.058 187161 INFO nova.virt.libvirt.driver [-] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Instance destroyed successfully.
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.059 187161 DEBUG nova.objects.instance [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lazy-loading 'resources' on Instance uuid 35a3db0d-2b6a-47be-bc85-4b164026935c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.274 187161 DEBUG nova.compute.manager [req-7f841ba9-f46a-4dce-ad01-1c8bb2231e6c req-7b962755-b278-4cb0-ae1b-86b1eb54718a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.275 187161 DEBUG oslo_concurrency.lockutils [req-7f841ba9-f46a-4dce-ad01-1c8bb2231e6c req-7b962755-b278-4cb0-ae1b-86b1eb54718a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.275 187161 DEBUG oslo_concurrency.lockutils [req-7f841ba9-f46a-4dce-ad01-1c8bb2231e6c req-7b962755-b278-4cb0-ae1b-86b1eb54718a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.276 187161 DEBUG oslo_concurrency.lockutils [req-7f841ba9-f46a-4dce-ad01-1c8bb2231e6c req-7b962755-b278-4cb0-ae1b-86b1eb54718a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.276 187161 DEBUG nova.compute.manager [req-7f841ba9-f46a-4dce-ad01-1c8bb2231e6c req-7b962755-b278-4cb0-ae1b-86b1eb54718a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] No waiting events found dispatching network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.276 187161 DEBUG nova.compute.manager [req-7f841ba9-f46a-4dce-ad01-1c8bb2231e6c req-7b962755-b278-4cb0-ae1b-86b1eb54718a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.565 187161 DEBUG nova.virt.libvirt.vif [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-02T23:56:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-508367334',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-508367334',id=8,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:57:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-q02f1mi0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T23:58:12Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=35a3db0d-2b6a-47be-bc85-4b164026935c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.566 187161 DEBUG nova.network.os_vif_util [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.567 187161 DEBUG nova.network.os_vif_util [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.567 187161 DEBUG os_vif [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.570 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.570 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap933e46ed-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.572 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.576 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.577 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.577 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=91b69662-2e86-4af1-b784-ddda5a868f98) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.579 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.581 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.584 187161 INFO os_vif [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57')
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.585 187161 INFO nova.virt.libvirt.driver [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Deleting instance files /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c_del
Dec 02 23:58:26 compute-1 nova_compute[187157]: 2025-12-02 23:58:26.590 187161 INFO nova.virt.libvirt.driver [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Deletion of /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c_del complete
Dec 02 23:58:27 compute-1 nova_compute[187157]: 2025-12-02 23:58:27.080 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:27 compute-1 nova_compute[187157]: 2025-12-02 23:58:27.107 187161 INFO nova.compute.manager [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Took 1.32 seconds to destroy the instance on the hypervisor.
Dec 02 23:58:27 compute-1 nova_compute[187157]: 2025-12-02 23:58:27.107 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 02 23:58:27 compute-1 nova_compute[187157]: 2025-12-02 23:58:27.108 187161 DEBUG nova.compute.manager [-] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 02 23:58:27 compute-1 nova_compute[187157]: 2025-12-02 23:58:27.108 187161 DEBUG nova.network.neutron [-] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 02 23:58:27 compute-1 nova_compute[187157]: 2025-12-02 23:58:27.109 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:27 compute-1 nova_compute[187157]: 2025-12-02 23:58:27.898 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:28 compute-1 podman[211495]: 2025-12-02 23:58:28.276518581 +0000 UTC m=+0.089094081 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 23:58:28 compute-1 nova_compute[187157]: 2025-12-02 23:58:28.352 187161 DEBUG nova.compute.manager [req-b4a87495-5da4-45c2-a215-fc4cdfb6c783 req-e5b63083-6fb5-4486-a046-d48af747f312 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:28 compute-1 nova_compute[187157]: 2025-12-02 23:58:28.353 187161 DEBUG oslo_concurrency.lockutils [req-b4a87495-5da4-45c2-a215-fc4cdfb6c783 req-e5b63083-6fb5-4486-a046-d48af747f312 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:28 compute-1 nova_compute[187157]: 2025-12-02 23:58:28.353 187161 DEBUG oslo_concurrency.lockutils [req-b4a87495-5da4-45c2-a215-fc4cdfb6c783 req-e5b63083-6fb5-4486-a046-d48af747f312 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:28 compute-1 nova_compute[187157]: 2025-12-02 23:58:28.353 187161 DEBUG oslo_concurrency.lockutils [req-b4a87495-5da4-45c2-a215-fc4cdfb6c783 req-e5b63083-6fb5-4486-a046-d48af747f312 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:28 compute-1 nova_compute[187157]: 2025-12-02 23:58:28.353 187161 DEBUG nova.compute.manager [req-b4a87495-5da4-45c2-a215-fc4cdfb6c783 req-e5b63083-6fb5-4486-a046-d48af747f312 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] No waiting events found dispatching network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:58:28 compute-1 nova_compute[187157]: 2025-12-02 23:58:28.353 187161 DEBUG nova.compute.manager [req-b4a87495-5da4-45c2-a215-fc4cdfb6c783 req-e5b63083-6fb5-4486-a046-d48af747f312 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:58:28 compute-1 nova_compute[187157]: 2025-12-02 23:58:28.354 187161 DEBUG nova.compute.manager [req-b4a87495-5da4-45c2-a215-fc4cdfb6c783 req-e5b63083-6fb5-4486-a046-d48af747f312 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-vif-deleted-933e46ed-57a7-472a-adf9-eff09ae7c559 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:28 compute-1 nova_compute[187157]: 2025-12-02 23:58:28.354 187161 INFO nova.compute.manager [req-b4a87495-5da4-45c2-a215-fc4cdfb6c783 req-e5b63083-6fb5-4486-a046-d48af747f312 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Neutron deleted interface 933e46ed-57a7-472a-adf9-eff09ae7c559; detaching it from the instance and deleting it from the info cache
Dec 02 23:58:28 compute-1 nova_compute[187157]: 2025-12-02 23:58:28.354 187161 DEBUG nova.network.neutron [req-b4a87495-5da4-45c2-a215-fc4cdfb6c783 req-e5b63083-6fb5-4486-a046-d48af747f312 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:58:28 compute-1 nova_compute[187157]: 2025-12-02 23:58:28.703 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:28 compute-1 nova_compute[187157]: 2025-12-02 23:58:28.708 187161 DEBUG nova.network.neutron [-] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:58:28 compute-1 nova_compute[187157]: 2025-12-02 23:58:28.862 187161 DEBUG nova.compute.manager [req-b4a87495-5da4-45c2-a215-fc4cdfb6c783 req-e5b63083-6fb5-4486-a046-d48af747f312 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Detach interface failed, port_id=933e46ed-57a7-472a-adf9-eff09ae7c559, reason: Instance 35a3db0d-2b6a-47be-bc85-4b164026935c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 02 23:58:28 compute-1 sshd-session[211493]: Received disconnect from 193.46.255.159 port 26366:11:  [preauth]
Dec 02 23:58:28 compute-1 sshd-session[211493]: Disconnected from authenticating user root 193.46.255.159 port 26366 [preauth]
Dec 02 23:58:29 compute-1 nova_compute[187157]: 2025-12-02 23:58:29.214 187161 INFO nova.compute.manager [-] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Took 2.11 seconds to deallocate network for instance.
Dec 02 23:58:29 compute-1 nova_compute[187157]: 2025-12-02 23:58:29.735 187161 DEBUG oslo_concurrency.lockutils [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:29 compute-1 nova_compute[187157]: 2025-12-02 23:58:29.736 187161 DEBUG oslo_concurrency.lockutils [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:29 compute-1 nova_compute[187157]: 2025-12-02 23:58:29.742 187161 DEBUG oslo_concurrency.lockutils [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:29 compute-1 nova_compute[187157]: 2025-12-02 23:58:29.791 187161 INFO nova.scheduler.client.report [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Deleted allocations for instance 35a3db0d-2b6a-47be-bc85-4b164026935c
Dec 02 23:58:30 compute-1 nova_compute[187157]: 2025-12-02 23:58:30.823 187161 DEBUG oslo_concurrency.lockutils [None req-032aa5d4-a293-45e2-b1c9-b8d48b520443 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.578s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:31 compute-1 nova_compute[187157]: 2025-12-02 23:58:31.580 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:32 compute-1 nova_compute[187157]: 2025-12-02 23:58:32.081 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:32 compute-1 nova_compute[187157]: 2025-12-02 23:58:32.245 187161 DEBUG oslo_concurrency.lockutils [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "0b759275-94f4-4c19-857f-f04aa6b32c6a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:32 compute-1 nova_compute[187157]: 2025-12-02 23:58:32.245 187161 DEBUG oslo_concurrency.lockutils [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "0b759275-94f4-4c19-857f-f04aa6b32c6a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:32 compute-1 nova_compute[187157]: 2025-12-02 23:58:32.246 187161 DEBUG oslo_concurrency.lockutils [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:32 compute-1 nova_compute[187157]: 2025-12-02 23:58:32.246 187161 DEBUG oslo_concurrency.lockutils [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:32 compute-1 nova_compute[187157]: 2025-12-02 23:58:32.247 187161 DEBUG oslo_concurrency.lockutils [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:32 compute-1 nova_compute[187157]: 2025-12-02 23:58:32.260 187161 INFO nova.compute.manager [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Terminating instance
Dec 02 23:58:32 compute-1 nova_compute[187157]: 2025-12-02 23:58:32.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:32 compute-1 nova_compute[187157]: 2025-12-02 23:58:32.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:32 compute-1 nova_compute[187157]: 2025-12-02 23:58:32.779 187161 DEBUG nova.compute.manager [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 02 23:58:32 compute-1 kernel: tape54f1a66-ed (unregistering): left promiscuous mode
Dec 02 23:58:32 compute-1 NetworkManager[55553]: <info>  [1764719912.8166] device (tape54f1a66-ed): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 02 23:58:32 compute-1 ovn_controller[95464]: 2025-12-02T23:58:32Z|00082|binding|INFO|Releasing lport e54f1a66-edd4-4c1f-ae52-8de4515e4d18 from this chassis (sb_readonly=0)
Dec 02 23:58:32 compute-1 ovn_controller[95464]: 2025-12-02T23:58:32Z|00083|binding|INFO|Setting lport e54f1a66-edd4-4c1f-ae52-8de4515e4d18 down in Southbound
Dec 02 23:58:32 compute-1 ovn_controller[95464]: 2025-12-02T23:58:32Z|00084|binding|INFO|Removing iface tape54f1a66-ed ovn-installed in OVS
Dec 02 23:58:32 compute-1 nova_compute[187157]: 2025-12-02 23:58:32.831 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:32 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:32.841 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:42:8f 10.100.0.5'], port_security=['fa:16:3e:d1:42:8f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '0b759275-94f4-4c19-857f-f04aa6b32c6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=e54f1a66-edd4-4c1f-ae52-8de4515e4d18) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:58:32 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:32.842 104348 INFO neutron.agent.ovn.metadata.agent [-] Port e54f1a66-edd4-4c1f-ae52-8de4515e4d18 in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a unbound from our chassis
Dec 02 23:58:32 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:32.843 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:58:32 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:32.858 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ff9978da-e4ee-475f-a82d-5d86eea8a5e8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:32 compute-1 nova_compute[187157]: 2025-12-02 23:58:32.859 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:32 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Deactivated successfully.
Dec 02 23:58:32 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Consumed 18.494s CPU time.
Dec 02 23:58:32 compute-1 systemd-machined[153454]: Machine qemu-4-instance-00000007 terminated.
Dec 02 23:58:32 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:32.893 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[831e76f6-fdeb-463f-a385-5d22c9d6f7f0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:32 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:32.895 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[6bd21614-6e0a-4393-8d81-63675fa98027]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:32 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:32.934 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[870ea804-8441-41b8-994f-7a47f2acc6fc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:32 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:32.959 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[13bf147e-8d25-4965-a6da-99632eb016ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 19, 'rx_bytes': 1792, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 19, 'rx_bytes': 1792, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371237, 'reachable_time': 25608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211550, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:32 compute-1 podman[211521]: 2025-12-02 23:58:32.97344678 +0000 UTC m=+0.124776664 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller)
Dec 02 23:58:32 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:32.988 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[49937cb2-df13-4a17-b0ce-1dcff989e3b4]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371249, 'tstamp': 371249}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211557, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371252, 'tstamp': 371252}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211557, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:32 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:32.989 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:32 compute-1 nova_compute[187157]: 2025-12-02 23:58:32.991 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:32 compute-1 nova_compute[187157]: 2025-12-02 23:58:32.996 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:32 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:32.997 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec494140-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:32 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:32.997 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:58:32 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:32.997 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec494140-a0, col_values=(('external_ids', {'iface-id': '9ee451cb-cc6e-44d6-98fb-cdfa0566e521'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:32 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:32.998 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:58:33 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:32.999 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[9939b4d9-f722-44b0-9c6f-33343e91d8a5]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ec494140-a5f4-4327-8807-d7248b1cdc9a\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ec494140-a5f4-4327-8807-d7248b1cdc9a\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.063 187161 INFO nova.virt.libvirt.driver [-] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Instance destroyed successfully.
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.064 187161 DEBUG nova.objects.instance [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lazy-loading 'resources' on Instance uuid 0b759275-94f4-4c19-857f-f04aa6b32c6a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.337 187161 DEBUG nova.compute.manager [req-0329f59d-bdd8-4170-92f1-0ccaee6ef4f5 req-0dda49fc-13bf-452e-a630-1ab92520cd76 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Received event network-vif-unplugged-e54f1a66-edd4-4c1f-ae52-8de4515e4d18 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.338 187161 DEBUG oslo_concurrency.lockutils [req-0329f59d-bdd8-4170-92f1-0ccaee6ef4f5 req-0dda49fc-13bf-452e-a630-1ab92520cd76 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.338 187161 DEBUG oslo_concurrency.lockutils [req-0329f59d-bdd8-4170-92f1-0ccaee6ef4f5 req-0dda49fc-13bf-452e-a630-1ab92520cd76 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.338 187161 DEBUG oslo_concurrency.lockutils [req-0329f59d-bdd8-4170-92f1-0ccaee6ef4f5 req-0dda49fc-13bf-452e-a630-1ab92520cd76 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.339 187161 DEBUG nova.compute.manager [req-0329f59d-bdd8-4170-92f1-0ccaee6ef4f5 req-0dda49fc-13bf-452e-a630-1ab92520cd76 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] No waiting events found dispatching network-vif-unplugged-e54f1a66-edd4-4c1f-ae52-8de4515e4d18 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.339 187161 DEBUG nova.compute.manager [req-0329f59d-bdd8-4170-92f1-0ccaee6ef4f5 req-0dda49fc-13bf-452e-a630-1ab92520cd76 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Received event network-vif-unplugged-e54f1a66-edd4-4c1f-ae52-8de4515e4d18 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.572 187161 DEBUG nova.virt.libvirt.vif [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-02T23:56:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1322206593',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1322206593',id=7,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:56:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-3b9yeltp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T23:56:28Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=0b759275-94f4-4c19-857f-f04aa6b32c6a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "address": "fa:16:3e:d1:42:8f", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape54f1a66-ed", "ovs_interfaceid": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.573 187161 DEBUG nova.network.os_vif_util [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "address": "fa:16:3e:d1:42:8f", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape54f1a66-ed", "ovs_interfaceid": "e54f1a66-edd4-4c1f-ae52-8de4515e4d18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.574 187161 DEBUG nova.network.os_vif_util [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:42:8f,bridge_name='br-int',has_traffic_filtering=True,id=e54f1a66-edd4-4c1f-ae52-8de4515e4d18,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape54f1a66-ed') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.575 187161 DEBUG os_vif [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:42:8f,bridge_name='br-int',has_traffic_filtering=True,id=e54f1a66-edd4-4c1f-ae52-8de4515e4d18,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape54f1a66-ed') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.578 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.578 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape54f1a66-ed, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.625 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.628 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.630 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.630 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b6c62ba6-223f-470e-96c7-fcd90837f806) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.631 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.651 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.654 187161 INFO os_vif [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:42:8f,bridge_name='br-int',has_traffic_filtering=True,id=e54f1a66-edd4-4c1f-ae52-8de4515e4d18,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape54f1a66-ed')
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.656 187161 INFO nova.virt.libvirt.driver [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Deleting instance files /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a_del
Dec 02 23:58:33 compute-1 nova_compute[187157]: 2025-12-02 23:58:33.658 187161 INFO nova.virt.libvirt.driver [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Deletion of /var/lib/nova/instances/0b759275-94f4-4c19-857f-f04aa6b32c6a_del complete
Dec 02 23:58:34 compute-1 nova_compute[187157]: 2025-12-02 23:58:34.175 187161 INFO nova.compute.manager [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Took 1.39 seconds to destroy the instance on the hypervisor.
Dec 02 23:58:34 compute-1 nova_compute[187157]: 2025-12-02 23:58:34.176 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 02 23:58:34 compute-1 nova_compute[187157]: 2025-12-02 23:58:34.177 187161 DEBUG nova.compute.manager [-] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 02 23:58:34 compute-1 nova_compute[187157]: 2025-12-02 23:58:34.177 187161 DEBUG nova.network.neutron [-] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 02 23:58:34 compute-1 nova_compute[187157]: 2025-12-02 23:58:34.178 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:34 compute-1 nova_compute[187157]: 2025-12-02 23:58:34.389 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:34 compute-1 nova_compute[187157]: 2025-12-02 23:58:34.697 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:34 compute-1 nova_compute[187157]: 2025-12-02 23:58:34.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:34 compute-1 nova_compute[187157]: 2025-12-02 23:58:34.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:58:35 compute-1 nova_compute[187157]: 2025-12-02 23:58:35.243 187161 DEBUG nova.network.neutron [-] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:58:35 compute-1 podman[211574]: 2025-12-02 23:58:35.262224678 +0000 UTC m=+0.087606806 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec 02 23:58:35 compute-1 nova_compute[187157]: 2025-12-02 23:58:35.460 187161 DEBUG nova.compute.manager [req-9c846846-50b4-40ed-b6d9-64c35b8a7028 req-ececfd6b-76a8-4d6e-82a5-7f7aa8951988 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Received event network-vif-unplugged-e54f1a66-edd4-4c1f-ae52-8de4515e4d18 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:35 compute-1 nova_compute[187157]: 2025-12-02 23:58:35.460 187161 DEBUG oslo_concurrency.lockutils [req-9c846846-50b4-40ed-b6d9-64c35b8a7028 req-ececfd6b-76a8-4d6e-82a5-7f7aa8951988 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:35 compute-1 nova_compute[187157]: 2025-12-02 23:58:35.461 187161 DEBUG oslo_concurrency.lockutils [req-9c846846-50b4-40ed-b6d9-64c35b8a7028 req-ececfd6b-76a8-4d6e-82a5-7f7aa8951988 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:35 compute-1 nova_compute[187157]: 2025-12-02 23:58:35.461 187161 DEBUG oslo_concurrency.lockutils [req-9c846846-50b4-40ed-b6d9-64c35b8a7028 req-ececfd6b-76a8-4d6e-82a5-7f7aa8951988 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "0b759275-94f4-4c19-857f-f04aa6b32c6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:35 compute-1 nova_compute[187157]: 2025-12-02 23:58:35.461 187161 DEBUG nova.compute.manager [req-9c846846-50b4-40ed-b6d9-64c35b8a7028 req-ececfd6b-76a8-4d6e-82a5-7f7aa8951988 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] No waiting events found dispatching network-vif-unplugged-e54f1a66-edd4-4c1f-ae52-8de4515e4d18 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:58:35 compute-1 nova_compute[187157]: 2025-12-02 23:58:35.461 187161 DEBUG nova.compute.manager [req-9c846846-50b4-40ed-b6d9-64c35b8a7028 req-ececfd6b-76a8-4d6e-82a5-7f7aa8951988 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Received event network-vif-unplugged-e54f1a66-edd4-4c1f-ae52-8de4515e4d18 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:58:35 compute-1 nova_compute[187157]: 2025-12-02 23:58:35.461 187161 DEBUG nova.compute.manager [req-9c846846-50b4-40ed-b6d9-64c35b8a7028 req-ececfd6b-76a8-4d6e-82a5-7f7aa8951988 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Received event network-vif-deleted-e54f1a66-edd4-4c1f-ae52-8de4515e4d18 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:35 compute-1 podman[197537]: time="2025-12-02T23:58:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:58:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:58:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 02 23:58:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:58:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3054 "" "Go-http-client/1.1"
Dec 02 23:58:35 compute-1 nova_compute[187157]: 2025-12-02 23:58:35.749 187161 INFO nova.compute.manager [-] [instance: 0b759275-94f4-4c19-857f-f04aa6b32c6a] Took 1.57 seconds to deallocate network for instance.
Dec 02 23:58:36 compute-1 nova_compute[187157]: 2025-12-02 23:58:36.270 187161 DEBUG oslo_concurrency.lockutils [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:36 compute-1 nova_compute[187157]: 2025-12-02 23:58:36.272 187161 DEBUG oslo_concurrency.lockutils [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:36 compute-1 nova_compute[187157]: 2025-12-02 23:58:36.511 187161 DEBUG nova.compute.provider_tree [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:58:37 compute-1 nova_compute[187157]: 2025-12-02 23:58:37.021 187161 DEBUG nova.scheduler.client.report [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:58:37 compute-1 nova_compute[187157]: 2025-12-02 23:58:37.084 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:37 compute-1 nova_compute[187157]: 2025-12-02 23:58:37.533 187161 DEBUG oslo_concurrency.lockutils [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.261s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:37 compute-1 nova_compute[187157]: 2025-12-02 23:58:37.561 187161 INFO nova.scheduler.client.report [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Deleted allocations for instance 0b759275-94f4-4c19-857f-f04aa6b32c6a
Dec 02 23:58:37 compute-1 nova_compute[187157]: 2025-12-02 23:58:37.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:38 compute-1 nova_compute[187157]: 2025-12-02 23:58:38.211 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:38 compute-1 nova_compute[187157]: 2025-12-02 23:58:38.212 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:38 compute-1 nova_compute[187157]: 2025-12-02 23:58:38.212 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:38 compute-1 nova_compute[187157]: 2025-12-02 23:58:38.212 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:58:38 compute-1 nova_compute[187157]: 2025-12-02 23:58:38.591 187161 DEBUG oslo_concurrency.lockutils [None req-2ceabf67-ac39-4e18-8f00-55122dbc6261 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "0b759275-94f4-4c19-857f-f04aa6b32c6a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.345s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:38 compute-1 nova_compute[187157]: 2025-12-02 23:58:38.632 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.296 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.364 187161 DEBUG oslo_concurrency.lockutils [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.366 187161 DEBUG oslo_concurrency.lockutils [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.366 187161 DEBUG oslo_concurrency.lockutils [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.367 187161 DEBUG oslo_concurrency.lockutils [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.367 187161 DEBUG oslo_concurrency.lockutils [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.385 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.386 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.397 187161 INFO nova.compute.manager [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Terminating instance
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.441 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.449 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.524 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.525 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.576 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.588 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.649 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.650 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.708 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.922 187161 DEBUG nova.compute.manager [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.929 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.930 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:58:39 compute-1 kernel: tapaa1a4037-74 (unregistering): left promiscuous mode
Dec 02 23:58:39 compute-1 NetworkManager[55553]: <info>  [1764719919.9465] device (tapaa1a4037-74): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 02 23:58:39 compute-1 ovn_controller[95464]: 2025-12-02T23:58:39Z|00085|binding|INFO|Releasing lport aa1a4037-7471-48e2-8297-5aeb45672ebb from this chassis (sb_readonly=0)
Dec 02 23:58:39 compute-1 ovn_controller[95464]: 2025-12-02T23:58:39Z|00086|binding|INFO|Setting lport aa1a4037-7471-48e2-8297-5aeb45672ebb down in Southbound
Dec 02 23:58:39 compute-1 ovn_controller[95464]: 2025-12-02T23:58:39Z|00087|binding|INFO|Removing iface tapaa1a4037-74 ovn-installed in OVS
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.959 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.963 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.965 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5386MB free_disk=73.08254623413086GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.965 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.965 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:39 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:39.977 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:d7:48 10.100.0.12'], port_security=['fa:16:3e:fd:d7:48 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd66a42a4-6bab-485d-a45f-0df43bf25d1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=aa1a4037-7471-48e2-8297-5aeb45672ebb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:58:39 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:39.978 104348 INFO neutron.agent.ovn.metadata.agent [-] Port aa1a4037-7471-48e2-8297-5aeb45672ebb in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a unbound from our chassis
Dec 02 23:58:39 compute-1 nova_compute[187157]: 2025-12-02 23:58:39.978 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:39 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:39.979 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:58:39 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:39.996 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[8104f717-510c-42a4-ad6b-e8099b50683e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:40 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Dec 02 23:58:40 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 4.288s CPU time.
Dec 02 23:58:40 compute-1 systemd-machined[153454]: Machine qemu-6-instance-00000006 terminated.
Dec 02 23:58:40 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:40.027 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[c76f65b2-8627-49a1-9277-2a0c682f6023]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:40 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:40.030 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[7e536e6f-31cf-47a8-ac81-54ed9aac2d62]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:40 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:40.056 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[ea35c320-4350-46e9-801a-a5053b949431]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:40 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:40.070 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ea0b4972-4d38-4a58-9bb9-fc86e3fcbb3d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 21, 'rx_bytes': 1792, 'tx_bytes': 1026, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 21, 'rx_bytes': 1792, 'tx_bytes': 1026, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371237, 'reachable_time': 25608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211625, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:40 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:40.081 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[36c6a505-1004-4d26-a421-5e42f698e3e6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371249, 'tstamp': 371249}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211626, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371252, 'tstamp': 371252}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211626, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:40 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:40.083 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.085 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.088 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:40 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:40.089 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec494140-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:40 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:40.089 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:58:40 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:40.090 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec494140-a0, col_values=(('external_ids', {'iface-id': '9ee451cb-cc6e-44d6-98fb-cdfa0566e521'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:40 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:40.090 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:58:40 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:40.091 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a31b0e58-4e9d-446f-b870-689cf2323028]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ec494140-a5f4-4327-8807-d7248b1cdc9a\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ec494140-a5f4-4327-8807-d7248b1cdc9a\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.117 187161 DEBUG nova.compute.manager [req-0a6248f8-0ce7-4b7d-a5a0-fd09c3a6ac8e req-0b1c47b4-5b39-4fdf-8e68-4ee15dd1a74e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.118 187161 DEBUG oslo_concurrency.lockutils [req-0a6248f8-0ce7-4b7d-a5a0-fd09c3a6ac8e req-0b1c47b4-5b39-4fdf-8e68-4ee15dd1a74e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.118 187161 DEBUG oslo_concurrency.lockutils [req-0a6248f8-0ce7-4b7d-a5a0-fd09c3a6ac8e req-0b1c47b4-5b39-4fdf-8e68-4ee15dd1a74e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.118 187161 DEBUG oslo_concurrency.lockutils [req-0a6248f8-0ce7-4b7d-a5a0-fd09c3a6ac8e req-0b1c47b4-5b39-4fdf-8e68-4ee15dd1a74e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.119 187161 DEBUG nova.compute.manager [req-0a6248f8-0ce7-4b7d-a5a0-fd09c3a6ac8e req-0b1c47b4-5b39-4fdf-8e68-4ee15dd1a74e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] No waiting events found dispatching network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.119 187161 DEBUG nova.compute.manager [req-0a6248f8-0ce7-4b7d-a5a0-fd09c3a6ac8e req-0b1c47b4-5b39-4fdf-8e68-4ee15dd1a74e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.172 187161 INFO nova.virt.libvirt.driver [-] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Instance destroyed successfully.
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.173 187161 DEBUG nova.objects.instance [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lazy-loading 'resources' on Instance uuid d66a42a4-6bab-485d-a45f-0df43bf25d1b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.679 187161 DEBUG nova.virt.libvirt.vif [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-02T23:55:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-116577734',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-116577734',id=6,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:56:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-dt7jcyvd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virt
io',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T23:58:00Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=d66a42a4-6bab-485d-a45f-0df43bf25d1b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.680 187161 DEBUG nova.network.os_vif_util [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.680 187161 DEBUG nova.network.os_vif_util [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fd:d7:48,bridge_name='br-int',has_traffic_filtering=True,id=aa1a4037-7471-48e2-8297-5aeb45672ebb,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa1a4037-74') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.681 187161 DEBUG os_vif [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:d7:48,bridge_name='br-int',has_traffic_filtering=True,id=aa1a4037-7471-48e2-8297-5aeb45672ebb,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa1a4037-74') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.683 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.683 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa1a4037-74, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.687 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.688 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.688 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=23765d3a-29f6-4182-b025-a79c422b34f9) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.689 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.690 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.692 187161 INFO os_vif [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:d7:48,bridge_name='br-int',has_traffic_filtering=True,id=aa1a4037-7471-48e2-8297-5aeb45672ebb,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa1a4037-74')
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.693 187161 INFO nova.virt.libvirt.driver [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Deleting instance files /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b_del
Dec 02 23:58:40 compute-1 nova_compute[187157]: 2025-12-02 23:58:40.693 187161 INFO nova.virt.libvirt.driver [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Deletion of /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b_del complete
Dec 02 23:58:41 compute-1 nova_compute[187157]: 2025-12-02 23:58:41.018 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 2e1c5d01-3310-41d8-8a6d-780b09f6bf06 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 02 23:58:41 compute-1 nova_compute[187157]: 2025-12-02 23:58:41.018 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance d8ccd45c-e570-4b75-b836-a93e2de1818b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 02 23:58:41 compute-1 nova_compute[187157]: 2025-12-02 23:58:41.018 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance d66a42a4-6bab-485d-a45f-0df43bf25d1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 02 23:58:41 compute-1 nova_compute[187157]: 2025-12-02 23:58:41.019 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:58:41 compute-1 nova_compute[187157]: 2025-12-02 23:58:41.019 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:58:39 up  1:05,  0 user,  load average: 0.38, 0.35, 0.39\n', 'num_instances': '3', 'num_vm_active': '3', 'num_task_None': '2', 'num_os_type_None': '3', 'num_proj_5f2368878ee9447ea8fcef9927711e2d': '3', 'io_workload': '0', 'num_task_deleting': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:58:41 compute-1 nova_compute[187157]: 2025-12-02 23:58:41.112 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:58:41 compute-1 nova_compute[187157]: 2025-12-02 23:58:41.207 187161 INFO nova.compute.manager [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Took 1.28 seconds to destroy the instance on the hypervisor.
Dec 02 23:58:41 compute-1 nova_compute[187157]: 2025-12-02 23:58:41.208 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 02 23:58:41 compute-1 nova_compute[187157]: 2025-12-02 23:58:41.208 187161 DEBUG nova.compute.manager [-] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 02 23:58:41 compute-1 nova_compute[187157]: 2025-12-02 23:58:41.209 187161 DEBUG nova.network.neutron [-] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 02 23:58:41 compute-1 nova_compute[187157]: 2025-12-02 23:58:41.209 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:41 compute-1 nova_compute[187157]: 2025-12-02 23:58:41.621 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:58:42 compute-1 nova_compute[187157]: 2025-12-02 23:58:42.086 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:42 compute-1 nova_compute[187157]: 2025-12-02 23:58:42.128 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:58:42 compute-1 nova_compute[187157]: 2025-12-02 23:58:42.128 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.163s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:42 compute-1 nova_compute[187157]: 2025-12-02 23:58:42.207 187161 DEBUG nova.compute.manager [req-cc96417b-b70c-4483-90a8-b26aee2b7f0a req-5d4247ab-1540-4085-801d-c1994bfc11b4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:42 compute-1 nova_compute[187157]: 2025-12-02 23:58:42.208 187161 DEBUG oslo_concurrency.lockutils [req-cc96417b-b70c-4483-90a8-b26aee2b7f0a req-5d4247ab-1540-4085-801d-c1994bfc11b4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:42 compute-1 nova_compute[187157]: 2025-12-02 23:58:42.208 187161 DEBUG oslo_concurrency.lockutils [req-cc96417b-b70c-4483-90a8-b26aee2b7f0a req-5d4247ab-1540-4085-801d-c1994bfc11b4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:42 compute-1 nova_compute[187157]: 2025-12-02 23:58:42.208 187161 DEBUG oslo_concurrency.lockutils [req-cc96417b-b70c-4483-90a8-b26aee2b7f0a req-5d4247ab-1540-4085-801d-c1994bfc11b4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:42 compute-1 nova_compute[187157]: 2025-12-02 23:58:42.208 187161 DEBUG nova.compute.manager [req-cc96417b-b70c-4483-90a8-b26aee2b7f0a req-5d4247ab-1540-4085-801d-c1994bfc11b4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] No waiting events found dispatching network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:58:42 compute-1 nova_compute[187157]: 2025-12-02 23:58:42.208 187161 DEBUG nova.compute.manager [req-cc96417b-b70c-4483-90a8-b26aee2b7f0a req-5d4247ab-1540-4085-801d-c1994bfc11b4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:58:42 compute-1 nova_compute[187157]: 2025-12-02 23:58:42.220 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:43 compute-1 nova_compute[187157]: 2025-12-02 23:58:43.129 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:43 compute-1 nova_compute[187157]: 2025-12-02 23:58:43.640 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:43 compute-1 nova_compute[187157]: 2025-12-02 23:58:43.641 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:44 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:44.204 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:58:44 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:44.205 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:58:44 compute-1 nova_compute[187157]: 2025-12-02 23:58:44.249 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:44 compute-1 nova_compute[187157]: 2025-12-02 23:58:44.532 187161 DEBUG nova.compute.manager [req-88f781dd-ac8e-4796-9cf5-07a257f4bf4e req-9dbe73a2-18dd-4816-8a2a-de462fd0b3b8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-deleted-aa1a4037-7471-48e2-8297-5aeb45672ebb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:44 compute-1 nova_compute[187157]: 2025-12-02 23:58:44.532 187161 INFO nova.compute.manager [req-88f781dd-ac8e-4796-9cf5-07a257f4bf4e req-9dbe73a2-18dd-4816-8a2a-de462fd0b3b8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Neutron deleted interface aa1a4037-7471-48e2-8297-5aeb45672ebb; detaching it from the instance and deleting it from the info cache
Dec 02 23:58:44 compute-1 nova_compute[187157]: 2025-12-02 23:58:44.532 187161 DEBUG nova.network.neutron [req-88f781dd-ac8e-4796-9cf5-07a257f4bf4e req-9dbe73a2-18dd-4816-8a2a-de462fd0b3b8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:58:44 compute-1 nova_compute[187157]: 2025-12-02 23:58:44.798 187161 DEBUG nova.network.neutron [-] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:58:45 compute-1 nova_compute[187157]: 2025-12-02 23:58:45.039 187161 DEBUG nova.compute.manager [req-88f781dd-ac8e-4796-9cf5-07a257f4bf4e req-9dbe73a2-18dd-4816-8a2a-de462fd0b3b8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Detach interface failed, port_id=aa1a4037-7471-48e2-8297-5aeb45672ebb, reason: Instance d66a42a4-6bab-485d-a45f-0df43bf25d1b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 02 23:58:45 compute-1 nova_compute[187157]: 2025-12-02 23:58:45.306 187161 INFO nova.compute.manager [-] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Took 4.10 seconds to deallocate network for instance.
Dec 02 23:58:45 compute-1 nova_compute[187157]: 2025-12-02 23:58:45.690 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:45 compute-1 nova_compute[187157]: 2025-12-02 23:58:45.846 187161 DEBUG oslo_concurrency.lockutils [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:45 compute-1 nova_compute[187157]: 2025-12-02 23:58:45.847 187161 DEBUG oslo_concurrency.lockutils [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:45 compute-1 nova_compute[187157]: 2025-12-02 23:58:45.929 187161 DEBUG nova.compute.provider_tree [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:58:46 compute-1 podman[211646]: 2025-12-02 23:58:46.254634212 +0000 UTC m=+0.091420287 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base 
Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public)
Dec 02 23:58:46 compute-1 nova_compute[187157]: 2025-12-02 23:58:46.437 187161 DEBUG nova.scheduler.client.report [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:58:46 compute-1 nova_compute[187157]: 2025-12-02 23:58:46.949 187161 DEBUG oslo_concurrency.lockutils [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:46 compute-1 nova_compute[187157]: 2025-12-02 23:58:46.971 187161 INFO nova.scheduler.client.report [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Deleted allocations for instance d66a42a4-6bab-485d-a45f-0df43bf25d1b
Dec 02 23:58:47 compute-1 nova_compute[187157]: 2025-12-02 23:58:47.090 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:48 compute-1 nova_compute[187157]: 2025-12-02 23:58:48.006 187161 DEBUG oslo_concurrency.lockutils [None req-2d4080d8-0d02-423f-9f00-fd6eeecfd312 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.640s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:49 compute-1 openstack_network_exporter[199685]: ERROR   23:58:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:58:49 compute-1 openstack_network_exporter[199685]: ERROR   23:58:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:58:49 compute-1 openstack_network_exporter[199685]: ERROR   23:58:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:58:49 compute-1 openstack_network_exporter[199685]: ERROR   23:58:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:58:49 compute-1 openstack_network_exporter[199685]: ERROR   23:58:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:58:49 compute-1 nova_compute[187157]: 2025-12-02 23:58:49.616 187161 DEBUG oslo_concurrency.lockutils [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:49 compute-1 nova_compute[187157]: 2025-12-02 23:58:49.617 187161 DEBUG oslo_concurrency.lockutils [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:49 compute-1 nova_compute[187157]: 2025-12-02 23:58:49.617 187161 DEBUG oslo_concurrency.lockutils [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:49 compute-1 nova_compute[187157]: 2025-12-02 23:58:49.617 187161 DEBUG oslo_concurrency.lockutils [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:49 compute-1 nova_compute[187157]: 2025-12-02 23:58:49.617 187161 DEBUG oslo_concurrency.lockutils [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:49 compute-1 nova_compute[187157]: 2025-12-02 23:58:49.630 187161 INFO nova.compute.manager [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Terminating instance
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.156 187161 DEBUG nova.compute.manager [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 02 23:58:50 compute-1 kernel: tap71da42a2-d9 (unregistering): left promiscuous mode
Dec 02 23:58:50 compute-1 NetworkManager[55553]: <info>  [1764719930.1887] device (tap71da42a2-d9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 02 23:58:50 compute-1 ovn_controller[95464]: 2025-12-02T23:58:50Z|00088|binding|INFO|Releasing lport 71da42a2-d97f-47f6-999c-93e4ef78e6e2 from this chassis (sb_readonly=0)
Dec 02 23:58:50 compute-1 ovn_controller[95464]: 2025-12-02T23:58:50Z|00089|binding|INFO|Setting lport 71da42a2-d97f-47f6-999c-93e4ef78e6e2 down in Southbound
Dec 02 23:58:50 compute-1 ovn_controller[95464]: 2025-12-02T23:58:50Z|00090|binding|INFO|Removing iface tap71da42a2-d9 ovn-installed in OVS
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.192 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:50 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:50.205 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:d6:d0 10.100.0.7'], port_security=['fa:16:3e:e2:d6:d0 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2e1c5d01-3310-41d8-8a6d-780b09f6bf06', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=71da42a2-d97f-47f6-999c-93e4ef78e6e2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:58:50 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:50.207 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 71da42a2-d97f-47f6-999c-93e4ef78e6e2 in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a unbound from our chassis
Dec 02 23:58:50 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:50.209 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.223 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:50 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:50.240 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ade69c17-5029-4913-a76e-c3e2c6c99fd5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:50 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Deactivated successfully.
Dec 02 23:58:50 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Consumed 24.111s CPU time.
Dec 02 23:58:50 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:50.270 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[223fd71c-c747-4bd0-9da1-60acbed4f759]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:50 compute-1 systemd-machined[153454]: Machine qemu-2-instance-00000005 terminated.
Dec 02 23:58:50 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:50.272 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[1f749e15-2fa2-4aec-96c7-bc964d786d7f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:50 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:50.296 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[929e0fa4-c435-41b5-a3bf-dc9596828a17]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:50 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:50.314 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ae62932e-7269-4b06-b679-7a3701fe81d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 23, 'rx_bytes': 1792, 'tx_bytes': 1110, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 23, 'rx_bytes': 1792, 'tx_bytes': 1110, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371237, 'reachable_time': 25608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211680, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:50 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:50.331 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[9b0627bf-bbe6-444e-a3a8-68e4eaa26a24]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371249, 'tstamp': 371249}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211681, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371252, 'tstamp': 371252}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211681, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:50 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:50.332 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.376 187161 DEBUG nova.compute.manager [req-43579581-524e-4d1c-982b-894d845c623c req-78a9284d-b49e-4fd0-9569-8a0daebf6b45 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Received event network-vif-unplugged-71da42a2-d97f-47f6-999c-93e4ef78e6e2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.376 187161 DEBUG oslo_concurrency.lockutils [req-43579581-524e-4d1c-982b-894d845c623c req-78a9284d-b49e-4fd0-9569-8a0daebf6b45 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.376 187161 DEBUG oslo_concurrency.lockutils [req-43579581-524e-4d1c-982b-894d845c623c req-78a9284d-b49e-4fd0-9569-8a0daebf6b45 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.376 187161 DEBUG oslo_concurrency.lockutils [req-43579581-524e-4d1c-982b-894d845c623c req-78a9284d-b49e-4fd0-9569-8a0daebf6b45 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.377 187161 DEBUG nova.compute.manager [req-43579581-524e-4d1c-982b-894d845c623c req-78a9284d-b49e-4fd0-9569-8a0daebf6b45 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] No waiting events found dispatching network-vif-unplugged-71da42a2-d97f-47f6-999c-93e4ef78e6e2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.377 187161 DEBUG nova.compute.manager [req-43579581-524e-4d1c-982b-894d845c623c req-78a9284d-b49e-4fd0-9569-8a0daebf6b45 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Received event network-vif-unplugged-71da42a2-d97f-47f6-999c-93e4ef78e6e2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.377 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.378 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.385 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:50 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:50.385 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec494140-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:50 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:50.386 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:58:50 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:50.386 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec494140-a0, col_values=(('external_ids', {'iface-id': '9ee451cb-cc6e-44d6-98fb-cdfa0566e521'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:50 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:50.386 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:58:50 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:50.388 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[cae7d824-ed25-422f-8153-fd689de35267]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ec494140-a5f4-4327-8807-d7248b1cdc9a\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ec494140-a5f4-4327-8807-d7248b1cdc9a\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.441 187161 INFO nova.virt.libvirt.driver [-] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Instance destroyed successfully.
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.441 187161 DEBUG nova.objects.instance [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lazy-loading 'resources' on Instance uuid 2e1c5d01-3310-41d8-8a6d-780b09f6bf06 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:58:50 compute-1 podman[211694]: 2025-12-02 23:58:50.534332037 +0000 UTC m=+0.091796984 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.692 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.949 187161 DEBUG nova.virt.libvirt.vif [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-02T23:54:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1874250387',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1874250387',id=5,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:54:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-zhbwq71l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T23:54:54Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=2e1c5d01-3310-41d8-8a6d-780b09f6bf06,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "address": "fa:16:3e:e2:d6:d0", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71da42a2-d9", "ovs_interfaceid": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.949 187161 DEBUG nova.network.os_vif_util [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "address": "fa:16:3e:e2:d6:d0", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71da42a2-d9", "ovs_interfaceid": "71da42a2-d97f-47f6-999c-93e4ef78e6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.951 187161 DEBUG nova.network.os_vif_util [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:d6:d0,bridge_name='br-int',has_traffic_filtering=True,id=71da42a2-d97f-47f6-999c-93e4ef78e6e2,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71da42a2-d9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.951 187161 DEBUG os_vif [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:d6:d0,bridge_name='br-int',has_traffic_filtering=True,id=71da42a2-d97f-47f6-999c-93e4ef78e6e2,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71da42a2-d9') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.955 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.955 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71da42a2-d9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.957 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.959 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.960 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.961 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=da881451-716c-4c52-b838-a3641b35a33a) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.962 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.963 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.965 187161 INFO os_vif [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:d6:d0,bridge_name='br-int',has_traffic_filtering=True,id=71da42a2-d97f-47f6-999c-93e4ef78e6e2,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71da42a2-d9')
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.966 187161 INFO nova.virt.libvirt.driver [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Deleting instance files /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06_del
Dec 02 23:58:50 compute-1 nova_compute[187157]: 2025-12-02 23:58:50.968 187161 INFO nova.virt.libvirt.driver [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Deletion of /var/lib/nova/instances/2e1c5d01-3310-41d8-8a6d-780b09f6bf06_del complete
Dec 02 23:58:51 compute-1 nova_compute[187157]: 2025-12-02 23:58:51.487 187161 INFO nova.compute.manager [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Took 1.33 seconds to destroy the instance on the hypervisor.
Dec 02 23:58:51 compute-1 nova_compute[187157]: 2025-12-02 23:58:51.488 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 02 23:58:51 compute-1 nova_compute[187157]: 2025-12-02 23:58:51.488 187161 DEBUG nova.compute.manager [-] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 02 23:58:51 compute-1 nova_compute[187157]: 2025-12-02 23:58:51.488 187161 DEBUG nova.network.neutron [-] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 02 23:58:51 compute-1 nova_compute[187157]: 2025-12-02 23:58:51.489 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:52 compute-1 nova_compute[187157]: 2025-12-02 23:58:52.075 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:52 compute-1 nova_compute[187157]: 2025-12-02 23:58:52.102 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:52 compute-1 nova_compute[187157]: 2025-12-02 23:58:52.459 187161 DEBUG nova.compute.manager [req-bf76d070-c10e-48e2-a648-dfb558ceedab req-a996c50c-ff6b-48b3-bbdb-4c4d8b62cfa3 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Received event network-vif-unplugged-71da42a2-d97f-47f6-999c-93e4ef78e6e2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:52 compute-1 nova_compute[187157]: 2025-12-02 23:58:52.459 187161 DEBUG oslo_concurrency.lockutils [req-bf76d070-c10e-48e2-a648-dfb558ceedab req-a996c50c-ff6b-48b3-bbdb-4c4d8b62cfa3 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:52 compute-1 nova_compute[187157]: 2025-12-02 23:58:52.460 187161 DEBUG oslo_concurrency.lockutils [req-bf76d070-c10e-48e2-a648-dfb558ceedab req-a996c50c-ff6b-48b3-bbdb-4c4d8b62cfa3 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:52 compute-1 nova_compute[187157]: 2025-12-02 23:58:52.460 187161 DEBUG oslo_concurrency.lockutils [req-bf76d070-c10e-48e2-a648-dfb558ceedab req-a996c50c-ff6b-48b3-bbdb-4c4d8b62cfa3 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:52 compute-1 nova_compute[187157]: 2025-12-02 23:58:52.460 187161 DEBUG nova.compute.manager [req-bf76d070-c10e-48e2-a648-dfb558ceedab req-a996c50c-ff6b-48b3-bbdb-4c4d8b62cfa3 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] No waiting events found dispatching network-vif-unplugged-71da42a2-d97f-47f6-999c-93e4ef78e6e2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:58:52 compute-1 nova_compute[187157]: 2025-12-02 23:58:52.461 187161 DEBUG nova.compute.manager [req-bf76d070-c10e-48e2-a648-dfb558ceedab req-a996c50c-ff6b-48b3-bbdb-4c4d8b62cfa3 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Received event network-vif-unplugged-71da42a2-d97f-47f6-999c-93e4ef78e6e2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:58:53 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:53.206 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:53 compute-1 nova_compute[187157]: 2025-12-02 23:58:53.764 187161 DEBUG nova.network.neutron [-] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:58:54 compute-1 nova_compute[187157]: 2025-12-02 23:58:54.271 187161 INFO nova.compute.manager [-] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Took 2.78 seconds to deallocate network for instance.
Dec 02 23:58:54 compute-1 nova_compute[187157]: 2025-12-02 23:58:54.579 187161 DEBUG nova.compute.manager [req-694c20f9-993a-4043-bbd8-cced81187e60 req-c162aeac-5cf9-4451-a59c-be960db7f6d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e1c5d01-3310-41d8-8a6d-780b09f6bf06] Received event network-vif-deleted-71da42a2-d97f-47f6-999c-93e4ef78e6e2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:54 compute-1 nova_compute[187157]: 2025-12-02 23:58:54.790 187161 DEBUG oslo_concurrency.lockutils [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:54 compute-1 nova_compute[187157]: 2025-12-02 23:58:54.791 187161 DEBUG oslo_concurrency.lockutils [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:54 compute-1 nova_compute[187157]: 2025-12-02 23:58:54.891 187161 DEBUG nova.compute.provider_tree [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:58:55 compute-1 nova_compute[187157]: 2025-12-02 23:58:55.398 187161 DEBUG nova.scheduler.client.report [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:58:55 compute-1 nova_compute[187157]: 2025-12-02 23:58:55.906 187161 DEBUG oslo_concurrency.lockutils [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:55 compute-1 nova_compute[187157]: 2025-12-02 23:58:55.936 187161 INFO nova.scheduler.client.report [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Deleted allocations for instance 2e1c5d01-3310-41d8-8a6d-780b09f6bf06
Dec 02 23:58:55 compute-1 nova_compute[187157]: 2025-12-02 23:58:55.962 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:56 compute-1 nova_compute[187157]: 2025-12-02 23:58:56.976 187161 DEBUG oslo_concurrency.lockutils [None req-85a70304-520a-4bf8-87d4-e08fbe317ef6 d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "2e1c5d01-3310-41d8-8a6d-780b09f6bf06" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.359s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:57 compute-1 nova_compute[187157]: 2025-12-02 23:58:57.105 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:57 compute-1 nova_compute[187157]: 2025-12-02 23:58:57.718 187161 DEBUG oslo_concurrency.lockutils [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:57 compute-1 nova_compute[187157]: 2025-12-02 23:58:57.719 187161 DEBUG oslo_concurrency.lockutils [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:57 compute-1 nova_compute[187157]: 2025-12-02 23:58:57.719 187161 DEBUG oslo_concurrency.lockutils [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:57 compute-1 nova_compute[187157]: 2025-12-02 23:58:57.719 187161 DEBUG oslo_concurrency.lockutils [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:57 compute-1 nova_compute[187157]: 2025-12-02 23:58:57.720 187161 DEBUG oslo_concurrency.lockutils [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:57 compute-1 nova_compute[187157]: 2025-12-02 23:58:57.735 187161 INFO nova.compute.manager [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Terminating instance
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.253 187161 DEBUG nova.compute.manager [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 02 23:58:58 compute-1 kernel: tapfbb4ca60-8a (unregistering): left promiscuous mode
Dec 02 23:58:58 compute-1 NetworkManager[55553]: <info>  [1764719938.2828] device (tapfbb4ca60-8a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 02 23:58:58 compute-1 ovn_controller[95464]: 2025-12-02T23:58:58Z|00091|binding|INFO|Releasing lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe from this chassis (sb_readonly=0)
Dec 02 23:58:58 compute-1 ovn_controller[95464]: 2025-12-02T23:58:58Z|00092|binding|INFO|Setting lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe down in Southbound
Dec 02 23:58:58 compute-1 ovn_controller[95464]: 2025-12-02T23:58:58Z|00093|binding|INFO|Removing iface tapfbb4ca60-8a ovn-installed in OVS
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.293 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.295 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.303 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:84:51 10.100.0.10'], port_security=['fa:16:3e:f8:84:51 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd8ccd45c-e570-4b75-b836-a93e2de1818b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.305 104348 INFO neutron.agent.ovn.metadata.agent [-] Port fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a unbound from our chassis
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.306 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ec494140-a5f4-4327-8807-d7248b1cdc9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.308 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[3c1aa32e-c541-44cf-a7fb-d16a756ced2b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.308 104348 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a namespace which is not needed anymore
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.319 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:58 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Deactivated successfully.
Dec 02 23:58:58 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Consumed 21.796s CPU time.
Dec 02 23:58:58 compute-1 systemd-machined[153454]: Machine qemu-3-instance-00000004 terminated.
Dec 02 23:58:58 compute-1 podman[211720]: 2025-12-02 23:58:58.421394157 +0000 UTC m=+0.087818750 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.457 187161 DEBUG nova.compute.manager [req-0817752d-1eca-449a-84dd-8c5050e153e1 req-331dd351-d9f5-4711-9249-d600be3518a1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.457 187161 DEBUG oslo_concurrency.lockutils [req-0817752d-1eca-449a-84dd-8c5050e153e1 req-331dd351-d9f5-4711-9249-d600be3518a1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.458 187161 DEBUG oslo_concurrency.lockutils [req-0817752d-1eca-449a-84dd-8c5050e153e1 req-331dd351-d9f5-4711-9249-d600be3518a1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.458 187161 DEBUG oslo_concurrency.lockutils [req-0817752d-1eca-449a-84dd-8c5050e153e1 req-331dd351-d9f5-4711-9249-d600be3518a1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.458 187161 DEBUG nova.compute.manager [req-0817752d-1eca-449a-84dd-8c5050e153e1 req-331dd351-d9f5-4711-9249-d600be3518a1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] No waiting events found dispatching network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.459 187161 DEBUG nova.compute.manager [req-0817752d-1eca-449a-84dd-8c5050e153e1 req-331dd351-d9f5-4711-9249-d600be3518a1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:58:58 compute-1 kernel: tapfbb4ca60-8a: entered promiscuous mode
Dec 02 23:58:58 compute-1 systemd-udevd[211740]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:58:58 compute-1 NetworkManager[55553]: <info>  [1764719938.4774] manager: (tapfbb4ca60-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Dec 02 23:58:58 compute-1 ovn_controller[95464]: 2025-12-02T23:58:58Z|00094|binding|INFO|Claiming lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for this chassis.
Dec 02 23:58:58 compute-1 ovn_controller[95464]: 2025-12-02T23:58:58Z|00095|binding|INFO|fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe: Claiming fa:16:3e:f8:84:51 10.100.0.10
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.478 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:58 compute-1 podman[211764]: 2025-12-02 23:58:58.481593677 +0000 UTC m=+0.044709040 container kill 564cd0ce96fd007a32a75eff16d4791ad98cf04aefa529264578e7ced7096856 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest)
Dec 02 23:58:58 compute-1 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[210078]: [NOTICE]   (210082) : haproxy version is 3.0.5-8e879a5
Dec 02 23:58:58 compute-1 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[210078]: [NOTICE]   (210082) : path to executable is /usr/sbin/haproxy
Dec 02 23:58:58 compute-1 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[210078]: [WARNING]  (210082) : Exiting Master process...
Dec 02 23:58:58 compute-1 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[210078]: [ALERT]    (210082) : Current worker (210084) exited with code 143 (Terminated)
Dec 02 23:58:58 compute-1 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[210078]: [WARNING]  (210082) : All workers exited. Exiting... (0)
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.487 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:84:51 10.100.0.10'], port_security=['fa:16:3e:f8:84:51 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd8ccd45c-e570-4b75-b836-a93e2de1818b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:58:58 compute-1 kernel: tapfbb4ca60-8a (unregistering): left promiscuous mode
Dec 02 23:58:58 compute-1 systemd[1]: libpod-564cd0ce96fd007a32a75eff16d4791ad98cf04aefa529264578e7ced7096856.scope: Deactivated successfully.
Dec 02 23:58:58 compute-1 virtnodedevd[187454]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 02 23:58:58 compute-1 virtnodedevd[187454]: hostname: compute-1
Dec 02 23:58:58 compute-1 virtnodedevd[187454]: ethtool ioctl error on tapfbb4ca60-8a: No such device
Dec 02 23:58:58 compute-1 ovn_controller[95464]: 2025-12-02T23:58:58Z|00096|binding|INFO|Setting lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe ovn-installed in OVS
Dec 02 23:58:58 compute-1 ovn_controller[95464]: 2025-12-02T23:58:58Z|00097|binding|INFO|Setting lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe up in Southbound
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.497 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:58 compute-1 ovn_controller[95464]: 2025-12-02T23:58:58Z|00098|binding|INFO|Releasing lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe from this chassis (sb_readonly=1)
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.501 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:58 compute-1 ovn_controller[95464]: 2025-12-02T23:58:58Z|00099|binding|INFO|Removing iface tapfbb4ca60-8a ovn-installed in OVS
Dec 02 23:58:58 compute-1 ovn_controller[95464]: 2025-12-02T23:58:58Z|00100|if_status|INFO|Not setting lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe down as sb is readonly
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.502 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:58 compute-1 virtnodedevd[187454]: ethtool ioctl error on tapfbb4ca60-8a: No such device
Dec 02 23:58:58 compute-1 ovn_controller[95464]: 2025-12-02T23:58:58Z|00101|binding|INFO|Releasing lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe from this chassis (sb_readonly=0)
Dec 02 23:58:58 compute-1 ovn_controller[95464]: 2025-12-02T23:58:58Z|00102|binding|INFO|Setting lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe down in Southbound
Dec 02 23:58:58 compute-1 virtnodedevd[187454]: ethtool ioctl error on tapfbb4ca60-8a: No such device
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.512 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:84:51 10.100.0.10'], port_security=['fa:16:3e:f8:84:51 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd8ccd45c-e570-4b75-b836-a93e2de1818b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.517 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:58 compute-1 virtnodedevd[187454]: ethtool ioctl error on tapfbb4ca60-8a: No such device
Dec 02 23:58:58 compute-1 virtnodedevd[187454]: ethtool ioctl error on tapfbb4ca60-8a: No such device
Dec 02 23:58:58 compute-1 virtnodedevd[187454]: ethtool ioctl error on tapfbb4ca60-8a: No such device
Dec 02 23:58:58 compute-1 virtnodedevd[187454]: ethtool ioctl error on tapfbb4ca60-8a: No such device
Dec 02 23:58:58 compute-1 virtnodedevd[187454]: ethtool ioctl error on tapfbb4ca60-8a: No such device
Dec 02 23:58:58 compute-1 podman[211783]: 2025-12-02 23:58:58.539902221 +0000 UTC m=+0.037380385 container died 564cd0ce96fd007a32a75eff16d4791ad98cf04aefa529264578e7ced7096856 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.542 187161 INFO nova.virt.libvirt.driver [-] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Instance destroyed successfully.
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.542 187161 DEBUG nova.objects.instance [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lazy-loading 'resources' on Instance uuid d8ccd45c-e570-4b75-b836-a93e2de1818b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:58:58 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-564cd0ce96fd007a32a75eff16d4791ad98cf04aefa529264578e7ced7096856-userdata-shm.mount: Deactivated successfully.
Dec 02 23:58:58 compute-1 systemd[1]: var-lib-containers-storage-overlay-0b6af4d22daec3c1512af11c35f2d6e36f051643211d608a6c556ba1f66c6ffd-merged.mount: Deactivated successfully.
Dec 02 23:58:58 compute-1 podman[211783]: 2025-12-02 23:58:58.570558153 +0000 UTC m=+0.068036287 container cleanup 564cd0ce96fd007a32a75eff16d4791ad98cf04aefa529264578e7ced7096856 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:58:58 compute-1 systemd[1]: libpod-conmon-564cd0ce96fd007a32a75eff16d4791ad98cf04aefa529264578e7ced7096856.scope: Deactivated successfully.
Dec 02 23:58:58 compute-1 podman[211789]: 2025-12-02 23:58:58.590444179 +0000 UTC m=+0.071008549 container remove 564cd0ce96fd007a32a75eff16d4791ad98cf04aefa529264578e7ced7096856 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.598 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a77b2011-a9d6-45c4-ae02-aee593a02176]: (4, ("Tue Dec  2 11:58:58 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a (564cd0ce96fd007a32a75eff16d4791ad98cf04aefa529264578e7ced7096856)\n564cd0ce96fd007a32a75eff16d4791ad98cf04aefa529264578e7ced7096856\nTue Dec  2 11:58:58 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a (564cd0ce96fd007a32a75eff16d4791ad98cf04aefa529264578e7ced7096856)\n564cd0ce96fd007a32a75eff16d4791ad98cf04aefa529264578e7ced7096856\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.601 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ca866039-af4f-4fd8-8aad-1629d96087c7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.602 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.602 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e3cb4f-112e-4d25-9c18-92caccbda641]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.603 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.605 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:58 compute-1 kernel: tapec494140-a0: left promiscuous mode
Dec 02 23:58:58 compute-1 nova_compute[187157]: 2025-12-02 23:58:58.620 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.622 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b4508345-e428-4028-8a39-70d0862b2118]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.640 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[11b92273-e7f1-418f-882f-cf8a64ec6226]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.642 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[eba8b53c-50a0-4caa-82d7-c72aa3740a25]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.657 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d0f22e82-6a51-4083-b48d-e11c9b4e2b3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371229, 'reachable_time': 36480, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211838, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.660 104464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.660 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[1814eac0-ce86-417c-a264-d0b6132c45f4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.661 104348 INFO neutron.agent.ovn.metadata.agent [-] Port fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a unbound from our chassis
Dec 02 23:58:58 compute-1 systemd[1]: run-netns-ovnmeta\x2dec494140\x2da5f4\x2d4327\x2d8807\x2dd7248b1cdc9a.mount: Deactivated successfully.
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.662 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ec494140-a5f4-4327-8807-d7248b1cdc9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.662 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[16630a77-8f60-44ba-b578-35567e60159b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.663 104348 INFO neutron.agent.ovn.metadata.agent [-] Port fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a unbound from our chassis
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.663 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ec494140-a5f4-4327-8807-d7248b1cdc9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:58:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:58:58.664 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ca925934-9ce4-4a6e-b554-57e07e96550d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.051 187161 DEBUG nova.virt.libvirt.vif [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-02T23:54:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-607610768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-607610768',id=4,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:55:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-ziravjgf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T23:55:46Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=d8ccd45c-e570-4b75-b836-a93e2de1818b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.052 187161 DEBUG nova.network.os_vif_util [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.053 187161 DEBUG nova.network.os_vif_util [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.054 187161 DEBUG os_vif [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.055 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.056 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbb4ca60-8a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.086 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.089 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.090 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.090 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=8ba5c139-d4ac-4606-b06f-ebf01f9ed250) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.091 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.092 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.094 187161 INFO os_vif [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a')
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.095 187161 INFO nova.virt.libvirt.driver [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Deleting instance files /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b_del
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.101 187161 INFO nova.virt.libvirt.driver [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Deletion of /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b_del complete
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.618 187161 INFO nova.compute.manager [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Took 1.36 seconds to destroy the instance on the hypervisor.
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.618 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.619 187161 DEBUG nova.compute.manager [-] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.619 187161 DEBUG nova.network.neutron [-] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 02 23:58:59 compute-1 nova_compute[187157]: 2025-12-02 23:58:59.619 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.246 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.517 187161 DEBUG nova.compute.manager [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.517 187161 DEBUG oslo_concurrency.lockutils [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.517 187161 DEBUG oslo_concurrency.lockutils [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.517 187161 DEBUG oslo_concurrency.lockutils [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.517 187161 DEBUG nova.compute.manager [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] No waiting events found dispatching network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.518 187161 DEBUG nova.compute.manager [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.518 187161 DEBUG nova.compute.manager [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.518 187161 DEBUG oslo_concurrency.lockutils [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.518 187161 DEBUG oslo_concurrency.lockutils [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.518 187161 DEBUG oslo_concurrency.lockutils [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.519 187161 DEBUG nova.compute.manager [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] No waiting events found dispatching network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.519 187161 WARNING nova.compute.manager [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received unexpected event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for instance with vm_state active and task_state deleting.
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.519 187161 DEBUG nova.compute.manager [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.519 187161 DEBUG oslo_concurrency.lockutils [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.519 187161 DEBUG oslo_concurrency.lockutils [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.519 187161 DEBUG oslo_concurrency.lockutils [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.520 187161 DEBUG nova.compute.manager [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] No waiting events found dispatching network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.520 187161 WARNING nova.compute.manager [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received unexpected event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for instance with vm_state active and task_state deleting.
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.520 187161 DEBUG nova.compute.manager [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.520 187161 DEBUG oslo_concurrency.lockutils [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.520 187161 DEBUG oslo_concurrency.lockutils [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.521 187161 DEBUG oslo_concurrency.lockutils [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.521 187161 DEBUG nova.compute.manager [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] No waiting events found dispatching network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.521 187161 DEBUG nova.compute.manager [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.521 187161 DEBUG nova.compute.manager [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.521 187161 DEBUG oslo_concurrency.lockutils [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.521 187161 DEBUG oslo_concurrency.lockutils [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.521 187161 DEBUG oslo_concurrency.lockutils [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.522 187161 DEBUG nova.compute.manager [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] No waiting events found dispatching network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:59:00 compute-1 nova_compute[187157]: 2025-12-02 23:59:00.522 187161 DEBUG nova.compute.manager [req-3cc8fa78-b921-4382-b0d0-c983561f59b3 req-5add2688-abed-4f3f-9d7a-1c2a7cee2ecd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:59:01 compute-1 nova_compute[187157]: 2025-12-02 23:59:01.304 187161 DEBUG nova.compute.manager [req-2aa88623-9d5b-4800-9a17-81cce72f8841 req-3faed775-cded-47d7-8248-40fafd65e7f5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-deleted-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:59:01 compute-1 nova_compute[187157]: 2025-12-02 23:59:01.304 187161 INFO nova.compute.manager [req-2aa88623-9d5b-4800-9a17-81cce72f8841 req-3faed775-cded-47d7-8248-40fafd65e7f5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Neutron deleted interface fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe; detaching it from the instance and deleting it from the info cache
Dec 02 23:59:01 compute-1 nova_compute[187157]: 2025-12-02 23:59:01.305 187161 DEBUG nova.network.neutron [req-2aa88623-9d5b-4800-9a17-81cce72f8841 req-3faed775-cded-47d7-8248-40fafd65e7f5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:59:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:59:01.708 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:59:01.708 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:01 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:59:01.708 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:01 compute-1 nova_compute[187157]: 2025-12-02 23:59:01.753 187161 DEBUG nova.network.neutron [-] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:59:01 compute-1 nova_compute[187157]: 2025-12-02 23:59:01.813 187161 DEBUG nova.compute.manager [req-2aa88623-9d5b-4800-9a17-81cce72f8841 req-3faed775-cded-47d7-8248-40fafd65e7f5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Detach interface failed, port_id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe, reason: Instance d8ccd45c-e570-4b75-b836-a93e2de1818b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 02 23:59:02 compute-1 nova_compute[187157]: 2025-12-02 23:59:02.106 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:02 compute-1 nova_compute[187157]: 2025-12-02 23:59:02.262 187161 INFO nova.compute.manager [-] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Took 2.64 seconds to deallocate network for instance.
Dec 02 23:59:02 compute-1 nova_compute[187157]: 2025-12-02 23:59:02.793 187161 DEBUG oslo_concurrency.lockutils [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:02 compute-1 nova_compute[187157]: 2025-12-02 23:59:02.793 187161 DEBUG oslo_concurrency.lockutils [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:02 compute-1 nova_compute[187157]: 2025-12-02 23:59:02.851 187161 DEBUG nova.compute.provider_tree [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:59:03 compute-1 podman[211840]: 2025-12-02 23:59:03.251281394 +0000 UTC m=+0.082624185 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4)
Dec 02 23:59:03 compute-1 nova_compute[187157]: 2025-12-02 23:59:03.366 187161 DEBUG nova.scheduler.client.report [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:59:03 compute-1 nova_compute[187157]: 2025-12-02 23:59:03.877 187161 DEBUG oslo_concurrency.lockutils [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.084s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:03 compute-1 nova_compute[187157]: 2025-12-02 23:59:03.909 187161 INFO nova.scheduler.client.report [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Deleted allocations for instance d8ccd45c-e570-4b75-b836-a93e2de1818b
Dec 02 23:59:04 compute-1 nova_compute[187157]: 2025-12-02 23:59:04.092 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:04 compute-1 nova_compute[187157]: 2025-12-02 23:59:04.942 187161 DEBUG oslo_concurrency.lockutils [None req-3eb85c74-ebf1-48de-8c20-9a64fd9ed61d d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.223s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:05 compute-1 podman[211867]: 2025-12-02 23:59:05.484659307 +0000 UTC m=+0.069371539 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:59:05 compute-1 podman[197537]: time="2025-12-02T23:59:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:59:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:59:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:59:05 compute-1 podman[197537]: @ - - [02/Dec/2025:23:59:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2605 "" "Go-http-client/1.1"
Dec 02 23:59:07 compute-1 nova_compute[187157]: 2025-12-02 23:59:07.109 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:09 compute-1 nova_compute[187157]: 2025-12-02 23:59:09.095 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:11 compute-1 nova_compute[187157]: 2025-12-02 23:59:11.459 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:12 compute-1 nova_compute[187157]: 2025-12-02 23:59:12.111 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:14 compute-1 nova_compute[187157]: 2025-12-02 23:59:14.097 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:17 compute-1 nova_compute[187157]: 2025-12-02 23:59:17.113 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:17 compute-1 podman[211888]: 2025-12-02 23:59:17.239400436 +0000 UTC m=+0.082696798 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 23:59:19 compute-1 nova_compute[187157]: 2025-12-02 23:59:19.099 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:19 compute-1 openstack_network_exporter[199685]: ERROR   23:59:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:59:19 compute-1 openstack_network_exporter[199685]: ERROR   23:59:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:59:19 compute-1 openstack_network_exporter[199685]: ERROR   23:59:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:59:19 compute-1 openstack_network_exporter[199685]: ERROR   23:59:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:59:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:59:19 compute-1 openstack_network_exporter[199685]: ERROR   23:59:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:59:19 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:59:21 compute-1 podman[211911]: 2025-12-02 23:59:21.249721549 +0000 UTC m=+0.080717729 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 02 23:59:22 compute-1 nova_compute[187157]: 2025-12-02 23:59:22.156 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:22 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:59:22.239 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:31:60 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c53a3e7-267c-42d7-8662-f773adcc4604', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86bd5114f990455bad9eb03145bbd520', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e22f98a-28c1-406a-8582-57ed07fee88b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=1e62127c-f508-4e9e-bb5e-b8835c45c013) old=Port_Binding(mac=['fa:16:3e:b1:31:60'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c53a3e7-267c-42d7-8662-f773adcc4604', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86bd5114f990455bad9eb03145bbd520', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:59:22 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:59:22.240 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 1e62127c-f508-4e9e-bb5e-b8835c45c013 in datapath 1c53a3e7-267c-42d7-8662-f773adcc4604 updated
Dec 02 23:59:22 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:59:22.242 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1c53a3e7-267c-42d7-8662-f773adcc4604, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:59:22 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:59:22.243 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[438dacb6-dfb3-445f-8dfd-f1f942e04fbf]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:24 compute-1 nova_compute[187157]: 2025-12-02 23:59:24.102 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:27 compute-1 nova_compute[187157]: 2025-12-02 23:59:27.157 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:28 compute-1 nova_compute[187157]: 2025-12-02 23:59:28.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:59:29 compute-1 nova_compute[187157]: 2025-12-02 23:59:29.104 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:29 compute-1 podman[211932]: 2025-12-02 23:59:29.231764671 +0000 UTC m=+0.071481706 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:59:32 compute-1 nova_compute[187157]: 2025-12-02 23:59:32.193 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:34 compute-1 nova_compute[187157]: 2025-12-02 23:59:34.105 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:34 compute-1 podman[211956]: 2025-12-02 23:59:34.248447578 +0000 UTC m=+0.095454813 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Dec 02 23:59:34 compute-1 nova_compute[187157]: 2025-12-02 23:59:34.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:59:34 compute-1 nova_compute[187157]: 2025-12-02 23:59:34.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:59:35 compute-1 podman[197537]: time="2025-12-02T23:59:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:59:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:59:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:59:35 compute-1 podman[197537]: @ - - [02/Dec/2025:23:59:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2610 "" "Go-http-client/1.1"
Dec 02 23:59:36 compute-1 podman[211983]: 2025-12-02 23:59:36.221498685 +0000 UTC m=+0.063533537 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 02 23:59:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:59:36.352 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:b3:30 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bf2d65cb-48b1-4884-a049-c1d6f7a31df9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bf2d65cb-48b1-4884-a049-c1d6f7a31df9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '916fb9304c874baa83b72f5956839b66', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a5c8896b-cfca-4a55-9039-47650a4a166a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=1a892beb-44d7-43a2-b31f-e3508e05fb34) old=Port_Binding(mac=['fa:16:3e:0f:b3:30'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-bf2d65cb-48b1-4884-a049-c1d6f7a31df9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bf2d65cb-48b1-4884-a049-c1d6f7a31df9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '916fb9304c874baa83b72f5956839b66', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:59:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:59:36.353 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 1a892beb-44d7-43a2-b31f-e3508e05fb34 in datapath bf2d65cb-48b1-4884-a049-c1d6f7a31df9 updated
Dec 02 23:59:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:59:36.355 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bf2d65cb-48b1-4884-a049-c1d6f7a31df9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:59:36 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:59:36.356 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b84a29c7-1a83-4948-a1c7-577b2a78baec]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:36 compute-1 nova_compute[187157]: 2025-12-02 23:59:36.695 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:59:36 compute-1 nova_compute[187157]: 2025-12-02 23:59:36.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:59:36 compute-1 nova_compute[187157]: 2025-12-02 23:59:36.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:59:37 compute-1 nova_compute[187157]: 2025-12-02 23:59:37.194 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:37 compute-1 nova_compute[187157]: 2025-12-02 23:59:37.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:59:38 compute-1 nova_compute[187157]: 2025-12-02 23:59:38.215 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:38 compute-1 nova_compute[187157]: 2025-12-02 23:59:38.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:38 compute-1 nova_compute[187157]: 2025-12-02 23:59:38.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:38 compute-1 nova_compute[187157]: 2025-12-02 23:59:38.217 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:59:38 compute-1 nova_compute[187157]: 2025-12-02 23:59:38.450 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:59:38 compute-1 nova_compute[187157]: 2025-12-02 23:59:38.452 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:59:38 compute-1 nova_compute[187157]: 2025-12-02 23:59:38.483 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:59:38 compute-1 nova_compute[187157]: 2025-12-02 23:59:38.484 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5841MB free_disk=73.16807174682617GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:59:38 compute-1 nova_compute[187157]: 2025-12-02 23:59:38.485 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:38 compute-1 nova_compute[187157]: 2025-12-02 23:59:38.486 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:39 compute-1 nova_compute[187157]: 2025-12-02 23:59:39.107 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:39 compute-1 nova_compute[187157]: 2025-12-02 23:59:39.626 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:59:39 compute-1 nova_compute[187157]: 2025-12-02 23:59:39.627 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:59:38 up  1:06,  0 user,  load average: 0.53, 0.39, 0.40\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:59:39 compute-1 nova_compute[187157]: 2025-12-02 23:59:39.654 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:59:40 compute-1 nova_compute[187157]: 2025-12-02 23:59:40.161 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:59:40 compute-1 nova_compute[187157]: 2025-12-02 23:59:40.672 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:59:40 compute-1 nova_compute[187157]: 2025-12-02 23:59:40.673 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.187s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:42 compute-1 nova_compute[187157]: 2025-12-02 23:59:42.197 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:42 compute-1 nova_compute[187157]: 2025-12-02 23:59:42.673 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:59:42 compute-1 nova_compute[187157]: 2025-12-02 23:59:42.673 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:59:44 compute-1 nova_compute[187157]: 2025-12-02 23:59:44.110 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:44 compute-1 ovn_controller[95464]: 2025-12-02T23:59:44Z|00103|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Dec 02 23:59:47 compute-1 nova_compute[187157]: 2025-12-02 23:59:47.242 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:48 compute-1 podman[212004]: 2025-12-02 23:59:48.207506008 +0000 UTC m=+0.054195781 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal)
Dec 02 23:59:49 compute-1 nova_compute[187157]: 2025-12-02 23:59:49.111 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:49 compute-1 openstack_network_exporter[199685]: ERROR   23:59:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:59:49 compute-1 openstack_network_exporter[199685]: ERROR   23:59:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:59:49 compute-1 openstack_network_exporter[199685]: ERROR   23:59:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:59:49 compute-1 openstack_network_exporter[199685]: ERROR   23:59:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:59:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:59:49 compute-1 openstack_network_exporter[199685]: ERROR   23:59:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:59:49 compute-1 openstack_network_exporter[199685]: 
Dec 02 23:59:51 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 02 23:59:51 compute-1 podman[212026]: 2025-12-02 23:59:51.884163135 +0000 UTC m=+0.057260046 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd)
Dec 02 23:59:52 compute-1 nova_compute[187157]: 2025-12-02 23:59:52.290 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:54 compute-1 nova_compute[187157]: 2025-12-02 23:59:54.114 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:57 compute-1 nova_compute[187157]: 2025-12-02 23:59:57.292 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:59:58.175 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:59:58 compute-1 ovn_metadata_agent[104343]: 2025-12-02 23:59:58.175 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:59:58 compute-1 nova_compute[187157]: 2025-12-02 23:59:58.176 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:59 compute-1 nova_compute[187157]: 2025-12-02 23:59:59.116 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:00 compute-1 systemd[1]: Starting update of the root trust anchor for DNSSEC validation in unbound...
Dec 03 00:00:00 compute-1 systemd[1]: Starting Rotate log files...
Dec 03 00:00:00 compute-1 systemd[1]: unbound-anchor.service: Deactivated successfully.
Dec 03 00:00:00 compute-1 systemd[1]: Finished update of the root trust anchor for DNSSEC validation in unbound.
Dec 03 00:00:00 compute-1 podman[212048]: 2025-12-03 00:00:00.212631372 +0000 UTC m=+0.053666900 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:00:00 compute-1 systemd[1]: logrotate.service: Deactivated successfully.
Dec 03 00:00:00 compute-1 systemd[1]: Finished Rotate log files.
Dec 03 00:00:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:01.711 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:00:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:01.712 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:00:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:01.712 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:02 compute-1 nova_compute[187157]: 2025-12-03 00:00:02.292 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:03 compute-1 nova_compute[187157]: 2025-12-03 00:00:03.845 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:00:03 compute-1 nova_compute[187157]: 2025-12-03 00:00:03.845 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:00:04 compute-1 nova_compute[187157]: 2025-12-03 00:00:04.118 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:04 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:04.176 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:00:04 compute-1 nova_compute[187157]: 2025-12-03 00:00:04.350 187161 DEBUG nova.compute.manager [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:00:04 compute-1 nova_compute[187157]: 2025-12-03 00:00:04.913 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:00:04 compute-1 nova_compute[187157]: 2025-12-03 00:00:04.913 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:00:04 compute-1 nova_compute[187157]: 2025-12-03 00:00:04.919 187161 DEBUG nova.virt.hardware [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:00:04 compute-1 nova_compute[187157]: 2025-12-03 00:00:04.919 187161 INFO nova.compute.claims [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Claim successful on node compute-1.ctlplane.example.com
Dec 03 00:00:05 compute-1 podman[212074]: 2025-12-03 00:00:05.307355354 +0000 UTC m=+0.133456384 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 03 00:00:05 compute-1 podman[197537]: time="2025-12-03T00:00:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:00:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:00:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:00:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:00:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2612 "" "Go-http-client/1.1"
Dec 03 00:00:05 compute-1 nova_compute[187157]: 2025-12-03 00:00:05.978 187161 DEBUG nova.compute.provider_tree [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:00:06 compute-1 nova_compute[187157]: 2025-12-03 00:00:06.487 187161 DEBUG nova.scheduler.client.report [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:00:06 compute-1 nova_compute[187157]: 2025-12-03 00:00:06.997 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.084s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:06 compute-1 nova_compute[187157]: 2025-12-03 00:00:06.998 187161 DEBUG nova.compute.manager [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:00:07 compute-1 podman[212102]: 2025-12-03 00:00:07.205236207 +0000 UTC m=+0.050308549 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Dec 03 00:00:07 compute-1 nova_compute[187157]: 2025-12-03 00:00:07.307 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:07 compute-1 nova_compute[187157]: 2025-12-03 00:00:07.509 187161 DEBUG nova.compute.manager [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:00:07 compute-1 nova_compute[187157]: 2025-12-03 00:00:07.509 187161 DEBUG nova.network.neutron [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:00:07 compute-1 nova_compute[187157]: 2025-12-03 00:00:07.510 187161 WARNING neutronclient.v2_0.client [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:00:07 compute-1 nova_compute[187157]: 2025-12-03 00:00:07.510 187161 WARNING neutronclient.v2_0.client [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:00:08 compute-1 nova_compute[187157]: 2025-12-03 00:00:08.017 187161 INFO nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:00:08 compute-1 nova_compute[187157]: 2025-12-03 00:00:08.416 187161 DEBUG nova.network.neutron [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Successfully created port: 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:00:08 compute-1 nova_compute[187157]: 2025-12-03 00:00:08.524 187161 DEBUG nova.compute.manager [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.120 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.124 187161 DEBUG nova.network.neutron [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Successfully updated port: 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.204 187161 DEBUG nova.compute.manager [req-2e45fbe7-05f2-48c8-83ab-6e5d33910bdd req-8a1da070-39f9-4700-813f-79a47b82261e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-changed-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.205 187161 DEBUG nova.compute.manager [req-2e45fbe7-05f2-48c8-83ab-6e5d33910bdd req-8a1da070-39f9-4700-813f-79a47b82261e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Refreshing instance network info cache due to event network-changed-4b2c586f-1a7f-4c5d-a6a1-90abac987f19. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.205 187161 DEBUG oslo_concurrency.lockutils [req-2e45fbe7-05f2-48c8-83ab-6e5d33910bdd req-8a1da070-39f9-4700-813f-79a47b82261e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-00869cbc-c7e6-47b4-8d21-c0ac64fe6381" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.206 187161 DEBUG oslo_concurrency.lockutils [req-2e45fbe7-05f2-48c8-83ab-6e5d33910bdd req-8a1da070-39f9-4700-813f-79a47b82261e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-00869cbc-c7e6-47b4-8d21-c0ac64fe6381" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.206 187161 DEBUG nova.network.neutron [req-2e45fbe7-05f2-48c8-83ab-6e5d33910bdd req-8a1da070-39f9-4700-813f-79a47b82261e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Refreshing network info cache for port 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.549 187161 DEBUG nova.compute.manager [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.552 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.552 187161 INFO nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Creating image(s)
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.554 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.554 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.555 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.556 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.562 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.565 187161 DEBUG oslo_concurrency.processutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.639 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "refresh_cache-00869cbc-c7e6-47b4-8d21-c0ac64fe6381" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.654 187161 DEBUG oslo_concurrency.processutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.655 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.655 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.656 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.659 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.660 187161 DEBUG oslo_concurrency.processutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.712 187161 WARNING neutronclient.v2_0.client [req-2e45fbe7-05f2-48c8-83ab-6e5d33910bdd req-8a1da070-39f9-4700-813f-79a47b82261e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.731 187161 DEBUG oslo_concurrency.processutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.732 187161 DEBUG oslo_concurrency.processutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.762 187161 DEBUG oslo_concurrency.processutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.763 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.764 187161 DEBUG oslo_concurrency.processutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.811 187161 DEBUG oslo_concurrency.processutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.812 187161 DEBUG nova.virt.disk.api [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Checking if we can resize image /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.812 187161 DEBUG oslo_concurrency.processutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.899 187161 DEBUG oslo_concurrency.processutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.900 187161 DEBUG nova.virt.disk.api [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Cannot resize image /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.900 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.900 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Ensure instance console log exists: /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.901 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.901 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:00:09 compute-1 nova_compute[187157]: 2025-12-03 00:00:09.901 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:10 compute-1 nova_compute[187157]: 2025-12-03 00:00:10.250 187161 DEBUG nova.network.neutron [req-2e45fbe7-05f2-48c8-83ab-6e5d33910bdd req-8a1da070-39f9-4700-813f-79a47b82261e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:00:10 compute-1 nova_compute[187157]: 2025-12-03 00:00:10.426 187161 DEBUG nova.network.neutron [req-2e45fbe7-05f2-48c8-83ab-6e5d33910bdd req-8a1da070-39f9-4700-813f-79a47b82261e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:00:10 compute-1 nova_compute[187157]: 2025-12-03 00:00:10.933 187161 DEBUG oslo_concurrency.lockutils [req-2e45fbe7-05f2-48c8-83ab-6e5d33910bdd req-8a1da070-39f9-4700-813f-79a47b82261e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-00869cbc-c7e6-47b4-8d21-c0ac64fe6381" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:00:10 compute-1 nova_compute[187157]: 2025-12-03 00:00:10.934 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquired lock "refresh_cache-00869cbc-c7e6-47b4-8d21-c0ac64fe6381" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:00:10 compute-1 nova_compute[187157]: 2025-12-03 00:00:10.935 187161 DEBUG nova.network.neutron [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.080 187161 DEBUG nova.network.neutron [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.293 187161 WARNING neutronclient.v2_0.client [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.309 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.455 187161 DEBUG nova.network.neutron [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Updating instance_info_cache with network_info: [{"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.964 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Releasing lock "refresh_cache-00869cbc-c7e6-47b4-8d21-c0ac64fe6381" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.965 187161 DEBUG nova.compute.manager [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Instance network_info: |[{"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.969 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Start _get_guest_xml network_info=[{"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.975 187161 WARNING nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.977 187161 DEBUG nova.virt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteBasicStrategy-server-361127533', uuid='00869cbc-c7e6-47b4-8d21-c0ac64fe6381'), owner=OwnerMeta(userid='f68e1c374dfc43b8a8431b13bafb13c8', username='tempest-TestExecuteBasicStrategy-436376556-project-admin', projectid='916fb9304c874baa83b72f5956839b66', projectname='tempest-TestExecuteBasicStrategy-436376556'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": 
null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764720012.977259) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.982 187161 DEBUG nova.virt.libvirt.host [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.982 187161 DEBUG nova.virt.libvirt.host [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.986 187161 DEBUG nova.virt.libvirt.host [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.987 187161 DEBUG nova.virt.libvirt.host [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.989 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.989 187161 DEBUG nova.virt.hardware [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.990 187161 DEBUG nova.virt.hardware [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.990 187161 DEBUG nova.virt.hardware [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.990 187161 DEBUG nova.virt.hardware [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.991 187161 DEBUG nova.virt.hardware [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.991 187161 DEBUG nova.virt.hardware [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.991 187161 DEBUG nova.virt.hardware [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.991 187161 DEBUG nova.virt.hardware [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.992 187161 DEBUG nova.virt.hardware [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.992 187161 DEBUG nova.virt.hardware [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.992 187161 DEBUG nova.virt.hardware [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.997 187161 DEBUG nova.virt.libvirt.vif [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:00:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-361127533',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-361127533',id=11,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='916fb9304c874baa83b72f5956839b66',ramdisk_id='',reservation_id='r-yzg3uqar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-436376556',owner_user_name='tempest-TestExecuteBasicStrategy-436376556
-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:00:08Z,user_data=None,user_id='f68e1c374dfc43b8a8431b13bafb13c8',uuid=00869cbc-c7e6-47b4-8d21-c0ac64fe6381,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.997 187161 DEBUG nova.network.os_vif_util [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Converting VIF {"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.998 187161 DEBUG nova.network.os_vif_util [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:55:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2c586f-1a7f-4c5d-a6a1-90abac987f19,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2c586f-1a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:00:12 compute-1 nova_compute[187157]: 2025-12-03 00:00:12.999 187161 DEBUG nova.objects.instance [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lazy-loading 'pci_devices' on Instance uuid 00869cbc-c7e6-47b4-8d21-c0ac64fe6381 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.511 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:00:13 compute-1 nova_compute[187157]:   <uuid>00869cbc-c7e6-47b4-8d21-c0ac64fe6381</uuid>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   <name>instance-0000000b</name>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   <memory>131072</memory>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   <vcpu>1</vcpu>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   <metadata>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteBasicStrategy-server-361127533</nova:name>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-03 00:00:12</nova:creationTime>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:00:13 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 03 00:00:13 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 03 00:00:13 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 03 00:00:13 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:00:13 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:00:13 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 03 00:00:13 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:00:13 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:00:13 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:00:13 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:00:13 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:00:13 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 03 00:00:13 compute-1 nova_compute[187157]:         <nova:properties>
Dec 03 00:00:13 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:00:13 compute-1 nova_compute[187157]:         </nova:properties>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       </nova:image>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <nova:owner>
Dec 03 00:00:13 compute-1 nova_compute[187157]:         <nova:user uuid="f68e1c374dfc43b8a8431b13bafb13c8">tempest-TestExecuteBasicStrategy-436376556-project-admin</nova:user>
Dec 03 00:00:13 compute-1 nova_compute[187157]:         <nova:project uuid="916fb9304c874baa83b72f5956839b66">tempest-TestExecuteBasicStrategy-436376556</nova:project>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       </nova:owner>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <nova:ports>
Dec 03 00:00:13 compute-1 nova_compute[187157]:         <nova:port uuid="4b2c586f-1a7f-4c5d-a6a1-90abac987f19">
Dec 03 00:00:13 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:         </nova:port>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       </nova:ports>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     </nova:instance>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   </metadata>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <system>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <entry name="serial">00869cbc-c7e6-47b4-8d21-c0ac64fe6381</entry>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <entry name="uuid">00869cbc-c7e6-47b4-8d21-c0ac64fe6381</entry>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     </system>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   </sysinfo>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   <os>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   </os>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   <features>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <acpi/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <apic/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <vmcoreinfo/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   </features>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   </clock>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact">
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <model>Nehalem</model>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   </cpu>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   <devices>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk.config"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:f4:55:7f"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <target dev="tap4b2c586f-1a"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     </interface>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/console.log" append="off"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     </serial>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <video>
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     </video>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     </rng>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <controller type="usb" index="0"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:00:13 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 03 00:00:13 compute-1 nova_compute[187157]:     </memballoon>
Dec 03 00:00:13 compute-1 nova_compute[187157]:   </devices>
Dec 03 00:00:13 compute-1 nova_compute[187157]: </domain>
Dec 03 00:00:13 compute-1 nova_compute[187157]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.513 187161 DEBUG nova.compute.manager [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Preparing to wait for external event network-vif-plugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.513 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.514 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.514 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.515 187161 DEBUG nova.virt.libvirt.vif [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:00:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-361127533',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-361127533',id=11,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='916fb9304c874baa83b72f5956839b66',ramdisk_id='',reservation_id='r-yzg3uqar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-436376556',owner_user_name='tempest-TestExecuteBasicStrategy-436376556-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:00:08Z,user_data=None,user_id='f68e1c374dfc43b8a8431b13bafb13c8',uuid=00869cbc-c7e6-47b4-8d21-c0ac64fe6381,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.516 187161 DEBUG nova.network.os_vif_util [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Converting VIF {"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.516 187161 DEBUG nova.network.os_vif_util [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:55:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2c586f-1a7f-4c5d-a6a1-90abac987f19,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2c586f-1a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.517 187161 DEBUG os_vif [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:55:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2c586f-1a7f-4c5d-a6a1-90abac987f19,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2c586f-1a') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.518 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.519 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.519 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.521 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.521 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '7b988612-1da0-56ef-a77f-2b45e4c189a3', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.564 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.566 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.569 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.570 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b2c586f-1a, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.570 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap4b2c586f-1a, col_values=(('qos', UUID('ab5a9e69-62dd-46bc-a890-1cd43b16413b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.570 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap4b2c586f-1a, col_values=(('external_ids', {'iface-id': '4b2c586f-1a7f-4c5d-a6a1-90abac987f19', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:55:7f', 'vm-uuid': '00869cbc-c7e6-47b4-8d21-c0ac64fe6381'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.572 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:13 compute-1 NetworkManager[55553]: <info>  [1764720013.5738] manager: (tap4b2c586f-1a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.574 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.583 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:13 compute-1 nova_compute[187157]: 2025-12-03 00:00:13.584 187161 INFO os_vif [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:55:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2c586f-1a7f-4c5d-a6a1-90abac987f19,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2c586f-1a')
Dec 03 00:00:15 compute-1 nova_compute[187157]: 2025-12-03 00:00:15.128 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:00:15 compute-1 nova_compute[187157]: 2025-12-03 00:00:15.128 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:00:15 compute-1 nova_compute[187157]: 2025-12-03 00:00:15.128 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] No VIF found with MAC fa:16:3e:f4:55:7f, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:00:15 compute-1 nova_compute[187157]: 2025-12-03 00:00:15.129 187161 INFO nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Using config drive
Dec 03 00:00:15 compute-1 nova_compute[187157]: 2025-12-03 00:00:15.646 187161 WARNING neutronclient.v2_0.client [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.370 187161 INFO nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Creating config drive at /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk.config
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.380 187161 DEBUG oslo_concurrency.processutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpnuqlle7x execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.525 187161 DEBUG oslo_concurrency.processutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpnuqlle7x" returned: 0 in 0.145s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:00:16 compute-1 kernel: tap4b2c586f-1a: entered promiscuous mode
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.604 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:16 compute-1 NetworkManager[55553]: <info>  [1764720016.6057] manager: (tap4b2c586f-1a): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Dec 03 00:00:16 compute-1 ovn_controller[95464]: 2025-12-03T00:00:16Z|00104|binding|INFO|Claiming lport 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 for this chassis.
Dec 03 00:00:16 compute-1 ovn_controller[95464]: 2025-12-03T00:00:16Z|00105|binding|INFO|4b2c586f-1a7f-4c5d-a6a1-90abac987f19: Claiming fa:16:3e:f4:55:7f 10.100.0.14
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.621 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:55:7f 10.100.0.14'], port_security=['fa:16:3e:f4:55:7f 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '00869cbc-c7e6-47b4-8d21-c0ac64fe6381', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c53a3e7-267c-42d7-8662-f773adcc4604', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '916fb9304c874baa83b72f5956839b66', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1093c49e-a0ca-44ab-a8bd-3c19ec9553c0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e22f98a-28c1-406a-8582-57ed07fee88b, chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=4b2c586f-1a7f-4c5d-a6a1-90abac987f19) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.623 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 in datapath 1c53a3e7-267c-42d7-8662-f773adcc4604 bound to our chassis
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.624 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1c53a3e7-267c-42d7-8662-f773adcc4604
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.636 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[60eb4a40-c5d0-4dcd-ba26-a2992e4d6811]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.637 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1c53a3e7-21 in ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.639 207957 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1c53a3e7-20 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.639 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[5d831f8b-323a-435d-85b8-d6cc0dcc9b39]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.640 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[2e5d3b0f-0954-445c-a2aa-2759ac575d5a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 systemd-udevd[212158]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:00:16 compute-1 systemd-machined[153454]: New machine qemu-8-instance-0000000b.
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.654 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[e25783bc-daa8-497c-829a-dd95bf7aaf08]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 NetworkManager[55553]: <info>  [1764720016.6650] device (tap4b2c586f-1a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:00:16 compute-1 NetworkManager[55553]: <info>  [1764720016.6669] device (tap4b2c586f-1a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.688 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[77ad90e8-edfa-4423-bb9c-d8f376805dd7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.692 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.695 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:16 compute-1 ovn_controller[95464]: 2025-12-03T00:00:16Z|00106|binding|INFO|Setting lport 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 ovn-installed in OVS
Dec 03 00:00:16 compute-1 systemd[1]: Started Virtual Machine qemu-8-instance-0000000b.
Dec 03 00:00:16 compute-1 ovn_controller[95464]: 2025-12-03T00:00:16Z|00107|binding|INFO|Setting lport 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 up in Southbound
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.699 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.717 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[25ca87d9-e943-4bb5-a2e1-193a60824e41]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.722 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[c84ea9db-8ade-4a5c-aa19-0ab331e9f3cd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 NetworkManager[55553]: <info>  [1764720016.7244] manager: (tap1c53a3e7-20): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.756 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[6279334e-256e-4f1f-80e4-0dda826fcd3e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.759 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[38e94b12-cd65-498c-9c80-b2ae38744884]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 NetworkManager[55553]: <info>  [1764720016.7860] device (tap1c53a3e7-20): carrier: link connected
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.791 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[d010d6f3-cd8c-41a2-af8a-cac85aa1aee9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.807 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[422e697f-3df9-43ec-bf40-1de4c86bf131]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1c53a3e7-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:31:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403644, 'reachable_time': 22315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212189, 'error': None, 'target': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.821 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[62beb515-915f-4041-9915-276164c21131]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb1:3160'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 403644, 'tstamp': 403644}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212191, 'error': None, 'target': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.835 187161 DEBUG nova.compute.manager [req-e844fb41-cf8d-45ec-a184-be7f4a861855 req-cf2c9973-bda0-411e-ae57-5fa6d75341cf 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-plugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.835 187161 DEBUG oslo_concurrency.lockutils [req-e844fb41-cf8d-45ec-a184-be7f4a861855 req-cf2c9973-bda0-411e-ae57-5fa6d75341cf 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.835 187161 DEBUG oslo_concurrency.lockutils [req-e844fb41-cf8d-45ec-a184-be7f4a861855 req-cf2c9973-bda0-411e-ae57-5fa6d75341cf 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.835 187161 DEBUG oslo_concurrency.lockutils [req-e844fb41-cf8d-45ec-a184-be7f4a861855 req-cf2c9973-bda0-411e-ae57-5fa6d75341cf 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.836 187161 DEBUG nova.compute.manager [req-e844fb41-cf8d-45ec-a184-be7f4a861855 req-cf2c9973-bda0-411e-ae57-5fa6d75341cf 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Processing event network-vif-plugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.838 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d29a3e4b-6489-45f6-9a50-5e99ef9c9547]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1c53a3e7-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:31:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403644, 'reachable_time': 22315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212192, 'error': None, 'target': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.874 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[38f409bd-f41e-421c-8bda-36446d8eee79]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.928 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d5f721-5023-4d56-bbf3-465595426089]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.929 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c53a3e7-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.930 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.930 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1c53a3e7-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:00:16 compute-1 kernel: tap1c53a3e7-20: entered promiscuous mode
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.931 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:16 compute-1 NetworkManager[55553]: <info>  [1764720016.9321] manager: (tap1c53a3e7-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.932 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.933 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1c53a3e7-20, col_values=(('external_ids', {'iface-id': '1e62127c-f508-4e9e-bb5e-b8835c45c013'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:00:16 compute-1 ovn_controller[95464]: 2025-12-03T00:00:16Z|00108|binding|INFO|Releasing lport 1e62127c-f508-4e9e-bb5e-b8835c45c013 from this chassis (sb_readonly=0)
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.945 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.946 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[4901c057-73b2-4a2d-beba-d75cef68166d]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.947 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.947 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.947 104348 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 1c53a3e7-267c-42d7-8662-f773adcc4604 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.947 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.948 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[9fff20b8-d3f1-41a1-90fa-bf2b3c9af9c8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.948 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.948 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[3a7dbf44-2b32-4f7d-9ca7-98cdfca2248f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.949 104348 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: global
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     log         /dev/log local0 debug
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     log-tag     haproxy-metadata-proxy-1c53a3e7-267c-42d7-8662-f773adcc4604
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     user        root
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     group       root
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     maxconn     1024
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     pidfile     /var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     daemon
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: defaults
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     log global
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     mode http
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     option httplog
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     option dontlognull
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     option http-server-close
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     option forwardfor
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     retries                 3
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     timeout http-request    30s
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     timeout connect         30s
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     timeout client          32s
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     timeout server          32s
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     timeout http-keep-alive 30s
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: listen listener
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     bind 169.254.169.254:80
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:     http-request add-header X-OVN-Network-ID 1c53a3e7-267c-42d7-8662-f773adcc4604
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:00:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:00:16.949 104348 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'env', 'PROCESS_TAG=haproxy-1c53a3e7-267c-42d7-8662-f773adcc4604', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1c53a3e7-267c-42d7-8662-f773adcc4604.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.966 187161 DEBUG nova.compute.manager [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.975 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.978 187161 INFO nova.virt.libvirt.driver [-] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Instance spawned successfully.
Dec 03 00:00:16 compute-1 nova_compute[187157]: 2025-12-03 00:00:16.979 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:00:17 compute-1 nova_compute[187157]: 2025-12-03 00:00:17.311 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:17 compute-1 podman[212231]: 2025-12-03 00:00:17.358561615 +0000 UTC m=+0.061306142 container create 771ef0682407d081d12a222dd6e47e0e563827a6bd1ef01a80d49708cd01eb6d (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Dec 03 00:00:17 compute-1 systemd[1]: Started libpod-conmon-771ef0682407d081d12a222dd6e47e0e563827a6bd1ef01a80d49708cd01eb6d.scope.
Dec 03 00:00:17 compute-1 podman[212231]: 2025-12-03 00:00:17.326797763 +0000 UTC m=+0.029542330 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:00:17 compute-1 systemd[1]: Started libcrun container.
Dec 03 00:00:17 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/859c80a6014e96c4c7bb15f58e785dd90cbf40413a1f72173efea561dfc964de/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:00:17 compute-1 podman[212231]: 2025-12-03 00:00:17.456355462 +0000 UTC m=+0.159100039 container init 771ef0682407d081d12a222dd6e47e0e563827a6bd1ef01a80d49708cd01eb6d (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:00:17 compute-1 podman[212231]: 2025-12-03 00:00:17.46293079 +0000 UTC m=+0.165675317 container start 771ef0682407d081d12a222dd6e47e0e563827a6bd1ef01a80d49708cd01eb6d (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 03 00:00:17 compute-1 neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604[212246]: [NOTICE]   (212250) : New worker (212252) forked
Dec 03 00:00:17 compute-1 neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604[212246]: [NOTICE]   (212250) : Loading success.
Dec 03 00:00:17 compute-1 nova_compute[187157]: 2025-12-03 00:00:17.492 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:00:17 compute-1 nova_compute[187157]: 2025-12-03 00:00:17.492 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:00:17 compute-1 nova_compute[187157]: 2025-12-03 00:00:17.493 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:00:17 compute-1 nova_compute[187157]: 2025-12-03 00:00:17.493 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:00:17 compute-1 nova_compute[187157]: 2025-12-03 00:00:17.493 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:00:17 compute-1 nova_compute[187157]: 2025-12-03 00:00:17.494 187161 DEBUG nova.virt.libvirt.driver [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:00:18 compute-1 nova_compute[187157]: 2025-12-03 00:00:18.002 187161 INFO nova.compute.manager [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Took 8.45 seconds to spawn the instance on the hypervisor.
Dec 03 00:00:18 compute-1 nova_compute[187157]: 2025-12-03 00:00:18.002 187161 DEBUG nova.compute.manager [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:00:18 compute-1 nova_compute[187157]: 2025-12-03 00:00:18.541 187161 INFO nova.compute.manager [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Took 13.68 seconds to build instance.
Dec 03 00:00:18 compute-1 nova_compute[187157]: 2025-12-03 00:00:18.574 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:18 compute-1 nova_compute[187157]: 2025-12-03 00:00:18.899 187161 DEBUG nova.compute.manager [req-7cfcae23-58d8-42bb-9cf8-76efb135139e req-0e726a5a-ff79-4521-ab77-1b283b5b03be 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-plugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:00:18 compute-1 nova_compute[187157]: 2025-12-03 00:00:18.900 187161 DEBUG oslo_concurrency.lockutils [req-7cfcae23-58d8-42bb-9cf8-76efb135139e req-0e726a5a-ff79-4521-ab77-1b283b5b03be 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:00:18 compute-1 nova_compute[187157]: 2025-12-03 00:00:18.901 187161 DEBUG oslo_concurrency.lockutils [req-7cfcae23-58d8-42bb-9cf8-76efb135139e req-0e726a5a-ff79-4521-ab77-1b283b5b03be 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:00:18 compute-1 nova_compute[187157]: 2025-12-03 00:00:18.902 187161 DEBUG oslo_concurrency.lockutils [req-7cfcae23-58d8-42bb-9cf8-76efb135139e req-0e726a5a-ff79-4521-ab77-1b283b5b03be 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:18 compute-1 nova_compute[187157]: 2025-12-03 00:00:18.903 187161 DEBUG nova.compute.manager [req-7cfcae23-58d8-42bb-9cf8-76efb135139e req-0e726a5a-ff79-4521-ab77-1b283b5b03be 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] No waiting events found dispatching network-vif-plugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:00:18 compute-1 nova_compute[187157]: 2025-12-03 00:00:18.903 187161 WARNING nova.compute.manager [req-7cfcae23-58d8-42bb-9cf8-76efb135139e req-0e726a5a-ff79-4521-ab77-1b283b5b03be 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received unexpected event network-vif-plugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 for instance with vm_state active and task_state None.
Dec 03 00:00:19 compute-1 nova_compute[187157]: 2025-12-03 00:00:19.051 187161 DEBUG oslo_concurrency.lockutils [None req-63f8b6d4-efcc-4216-afea-62b8a61ef08b f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.206s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:19 compute-1 podman[212261]: 2025-12-03 00:00:19.242488202 +0000 UTC m=+0.082066801 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, distribution-scope=public, name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Dec 03 00:00:19 compute-1 openstack_network_exporter[199685]: ERROR   00:00:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:00:19 compute-1 openstack_network_exporter[199685]: ERROR   00:00:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:00:19 compute-1 openstack_network_exporter[199685]: ERROR   00:00:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:00:19 compute-1 openstack_network_exporter[199685]: ERROR   00:00:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:00:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:00:19 compute-1 openstack_network_exporter[199685]: ERROR   00:00:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:00:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:00:22 compute-1 podman[212283]: 2025-12-03 00:00:22.229159707 +0000 UTC m=+0.064626702 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 03 00:00:22 compute-1 nova_compute[187157]: 2025-12-03 00:00:22.313 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:23 compute-1 nova_compute[187157]: 2025-12-03 00:00:23.578 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:27 compute-1 nova_compute[187157]: 2025-12-03 00:00:27.316 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:28 compute-1 nova_compute[187157]: 2025-12-03 00:00:28.582 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:28 compute-1 nova_compute[187157]: 2025-12-03 00:00:28.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:30 compute-1 ovn_controller[95464]: 2025-12-03T00:00:30Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f4:55:7f 10.100.0.14
Dec 03 00:00:30 compute-1 ovn_controller[95464]: 2025-12-03T00:00:30Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f4:55:7f 10.100.0.14
Dec 03 00:00:31 compute-1 podman[212315]: 2025-12-03 00:00:31.272516114 +0000 UTC m=+0.086561889 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:00:32 compute-1 nova_compute[187157]: 2025-12-03 00:00:32.318 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:33 compute-1 nova_compute[187157]: 2025-12-03 00:00:33.585 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:35 compute-1 podman[197537]: time="2025-12-03T00:00:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:00:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:00:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:00:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:00:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3073 "" "Go-http-client/1.1"
Dec 03 00:00:35 compute-1 nova_compute[187157]: 2025-12-03 00:00:35.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:35 compute-1 podman[212340]: 2025-12-03 00:00:35.812956922 +0000 UTC m=+0.106778054 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 03 00:00:36 compute-1 nova_compute[187157]: 2025-12-03 00:00:36.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:37 compute-1 nova_compute[187157]: 2025-12-03 00:00:37.321 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:37 compute-1 nova_compute[187157]: 2025-12-03 00:00:37.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:37 compute-1 nova_compute[187157]: 2025-12-03 00:00:37.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:00:38 compute-1 podman[212366]: 2025-12-03 00:00:38.236943862 +0000 UTC m=+0.074765245 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 03 00:00:38 compute-1 nova_compute[187157]: 2025-12-03 00:00:38.589 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:38 compute-1 nova_compute[187157]: 2025-12-03 00:00:38.696 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:38 compute-1 nova_compute[187157]: 2025-12-03 00:00:38.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:39 compute-1 nova_compute[187157]: 2025-12-03 00:00:39.224 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:00:39 compute-1 nova_compute[187157]: 2025-12-03 00:00:39.224 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:00:39 compute-1 nova_compute[187157]: 2025-12-03 00:00:39.225 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:39 compute-1 nova_compute[187157]: 2025-12-03 00:00:39.225 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:00:40 compute-1 nova_compute[187157]: 2025-12-03 00:00:40.302 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:00:40 compute-1 nova_compute[187157]: 2025-12-03 00:00:40.356 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:00:40 compute-1 nova_compute[187157]: 2025-12-03 00:00:40.357 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:00:40 compute-1 nova_compute[187157]: 2025-12-03 00:00:40.414 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:00:40 compute-1 nova_compute[187157]: 2025-12-03 00:00:40.557 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:00:40 compute-1 nova_compute[187157]: 2025-12-03 00:00:40.558 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:00:40 compute-1 nova_compute[187157]: 2025-12-03 00:00:40.597 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:00:40 compute-1 nova_compute[187157]: 2025-12-03 00:00:40.597 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5668MB free_disk=73.13897705078125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:00:40 compute-1 nova_compute[187157]: 2025-12-03 00:00:40.597 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:00:40 compute-1 nova_compute[187157]: 2025-12-03 00:00:40.598 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:00:41 compute-1 nova_compute[187157]: 2025-12-03 00:00:41.655 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 00869cbc-c7e6-47b4-8d21-c0ac64fe6381 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:00:41 compute-1 nova_compute[187157]: 2025-12-03 00:00:41.656 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:00:41 compute-1 nova_compute[187157]: 2025-12-03 00:00:41.656 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:00:40 up  1:07,  0 user,  load average: 0.47, 0.39, 0.39\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_916fb9304c874baa83b72f5956839b66': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:00:41 compute-1 nova_compute[187157]: 2025-12-03 00:00:41.700 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:00:42 compute-1 nova_compute[187157]: 2025-12-03 00:00:42.210 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:00:42 compute-1 nova_compute[187157]: 2025-12-03 00:00:42.326 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:42 compute-1 nova_compute[187157]: 2025-12-03 00:00:42.722 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:00:42 compute-1 nova_compute[187157]: 2025-12-03 00:00:42.722 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.125s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:43 compute-1 nova_compute[187157]: 2025-12-03 00:00:43.593 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:43 compute-1 nova_compute[187157]: 2025-12-03 00:00:43.822 187161 DEBUG nova.compute.manager [None req-1a17b9a6-c747-4b94-b000-fe7c73215c1a 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:636
Dec 03 00:00:43 compute-1 nova_compute[187157]: 2025-12-03 00:00:43.869 187161 DEBUG nova.compute.provider_tree [None req-1a17b9a6-c747-4b94-b000-fe7c73215c1a 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Updating resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 generation from 10 to 12 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 03 00:00:44 compute-1 nova_compute[187157]: 2025-12-03 00:00:44.722 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:45 compute-1 nova_compute[187157]: 2025-12-03 00:00:45.234 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:45 compute-1 nova_compute[187157]: 2025-12-03 00:00:45.234 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:46 compute-1 ovn_controller[95464]: 2025-12-03T00:00:46Z|00109|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Dec 03 00:00:47 compute-1 nova_compute[187157]: 2025-12-03 00:00:47.327 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:48 compute-1 nova_compute[187157]: 2025-12-03 00:00:48.596 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:49 compute-1 openstack_network_exporter[199685]: ERROR   00:00:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:00:49 compute-1 openstack_network_exporter[199685]: ERROR   00:00:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:00:49 compute-1 openstack_network_exporter[199685]: ERROR   00:00:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:00:49 compute-1 openstack_network_exporter[199685]: ERROR   00:00:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:00:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:00:49 compute-1 openstack_network_exporter[199685]: ERROR   00:00:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:00:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:00:50 compute-1 podman[212393]: 2025-12-03 00:00:50.234314553 +0000 UTC m=+0.072679205 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible)
Dec 03 00:00:52 compute-1 nova_compute[187157]: 2025-12-03 00:00:52.261 187161 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Check if temp file /var/lib/nova/instances/tmp3pj4_prd exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Dec 03 00:00:52 compute-1 nova_compute[187157]: 2025-12-03 00:00:52.266 187161 DEBUG nova.compute.manager [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3pj4_prd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='00869cbc-c7e6-47b4-8d21-c0ac64fe6381',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Dec 03 00:00:52 compute-1 nova_compute[187157]: 2025-12-03 00:00:52.328 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:53 compute-1 podman[212415]: 2025-12-03 00:00:53.214890662 +0000 UTC m=+0.057132832 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 03 00:00:53 compute-1 nova_compute[187157]: 2025-12-03 00:00:53.599 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:55 compute-1 sshd-session[212435]: Invalid user admin from 185.156.73.233 port 19756
Dec 03 00:00:55 compute-1 sshd-session[212435]: Connection closed by invalid user admin 185.156.73.233 port 19756 [preauth]
Dec 03 00:00:56 compute-1 nova_compute[187157]: 2025-12-03 00:00:56.657 187161 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:00:56 compute-1 nova_compute[187157]: 2025-12-03 00:00:56.709 187161 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:00:56 compute-1 nova_compute[187157]: 2025-12-03 00:00:56.710 187161 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:00:56 compute-1 nova_compute[187157]: 2025-12-03 00:00:56.762 187161 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:00:56 compute-1 nova_compute[187157]: 2025-12-03 00:00:56.764 187161 DEBUG nova.compute.manager [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Preparing to wait for external event network-vif-plugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:00:56 compute-1 nova_compute[187157]: 2025-12-03 00:00:56.764 187161 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:00:56 compute-1 nova_compute[187157]: 2025-12-03 00:00:56.764 187161 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:00:56 compute-1 nova_compute[187157]: 2025-12-03 00:00:56.765 187161 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:57 compute-1 nova_compute[187157]: 2025-12-03 00:00:57.331 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:58 compute-1 nova_compute[187157]: 2025-12-03 00:00:58.602 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:01.713 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:01.713 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:01.714 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:01 compute-1 CROND[212445]: (root) CMD (run-parts /etc/cron.hourly)
Dec 03 00:01:01 compute-1 run-parts[212448]: (/etc/cron.hourly) starting 0anacron
Dec 03 00:01:01 compute-1 anacron[212456]: Anacron started on 2025-12-03
Dec 03 00:01:01 compute-1 anacron[212456]: Job `cron.monthly' locked by another anacron - skipping
Dec 03 00:01:01 compute-1 anacron[212456]: Normal exit (0 jobs run)
Dec 03 00:01:01 compute-1 run-parts[212458]: (/etc/cron.hourly) finished 0anacron
Dec 03 00:01:01 compute-1 CROND[212444]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 03 00:01:02 compute-1 podman[212459]: 2025-12-03 00:01:02.030663303 +0000 UTC m=+0.059382997 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:01:02 compute-1 nova_compute[187157]: 2025-12-03 00:01:02.334 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:02.525 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:01:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:02.526 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:01:02 compute-1 nova_compute[187157]: 2025-12-03 00:01:02.526 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:02 compute-1 nova_compute[187157]: 2025-12-03 00:01:02.565 187161 DEBUG nova.compute.manager [req-75962d2c-b2a4-41fa-afe7-bc19737a6cf8 req-05e0ef46-11e3-435d-a89a-213809dfd243 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:01:02 compute-1 nova_compute[187157]: 2025-12-03 00:01:02.566 187161 DEBUG oslo_concurrency.lockutils [req-75962d2c-b2a4-41fa-afe7-bc19737a6cf8 req-05e0ef46-11e3-435d-a89a-213809dfd243 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:02 compute-1 nova_compute[187157]: 2025-12-03 00:01:02.566 187161 DEBUG oslo_concurrency.lockutils [req-75962d2c-b2a4-41fa-afe7-bc19737a6cf8 req-05e0ef46-11e3-435d-a89a-213809dfd243 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:02 compute-1 nova_compute[187157]: 2025-12-03 00:01:02.566 187161 DEBUG oslo_concurrency.lockutils [req-75962d2c-b2a4-41fa-afe7-bc19737a6cf8 req-05e0ef46-11e3-435d-a89a-213809dfd243 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:02 compute-1 nova_compute[187157]: 2025-12-03 00:01:02.567 187161 DEBUG nova.compute.manager [req-75962d2c-b2a4-41fa-afe7-bc19737a6cf8 req-05e0ef46-11e3-435d-a89a-213809dfd243 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] No event matching network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 in dict_keys([('network-vif-plugged', '4b2c586f-1a7f-4c5d-a6a1-90abac987f19')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Dec 03 00:01:02 compute-1 nova_compute[187157]: 2025-12-03 00:01:02.567 187161 DEBUG nova.compute.manager [req-75962d2c-b2a4-41fa-afe7-bc19737a6cf8 req-05e0ef46-11e3-435d-a89a-213809dfd243 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:01:03 compute-1 nova_compute[187157]: 2025-12-03 00:01:03.606 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:04 compute-1 nova_compute[187157]: 2025-12-03 00:01:04.305 187161 INFO nova.compute.manager [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Took 7.54 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Dec 03 00:01:04 compute-1 nova_compute[187157]: 2025-12-03 00:01:04.807 187161 DEBUG nova.compute.manager [req-d2b122be-ce4f-4025-8e34-107f90d405b8 req-4d54c9f8-2bc3-43a5-888d-421522bb831e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-plugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:01:04 compute-1 nova_compute[187157]: 2025-12-03 00:01:04.808 187161 DEBUG oslo_concurrency.lockutils [req-d2b122be-ce4f-4025-8e34-107f90d405b8 req-4d54c9f8-2bc3-43a5-888d-421522bb831e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:04 compute-1 nova_compute[187157]: 2025-12-03 00:01:04.808 187161 DEBUG oslo_concurrency.lockutils [req-d2b122be-ce4f-4025-8e34-107f90d405b8 req-4d54c9f8-2bc3-43a5-888d-421522bb831e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:04 compute-1 nova_compute[187157]: 2025-12-03 00:01:04.808 187161 DEBUG oslo_concurrency.lockutils [req-d2b122be-ce4f-4025-8e34-107f90d405b8 req-4d54c9f8-2bc3-43a5-888d-421522bb831e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:04 compute-1 nova_compute[187157]: 2025-12-03 00:01:04.808 187161 DEBUG nova.compute.manager [req-d2b122be-ce4f-4025-8e34-107f90d405b8 req-4d54c9f8-2bc3-43a5-888d-421522bb831e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Processing event network-vif-plugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:01:04 compute-1 nova_compute[187157]: 2025-12-03 00:01:04.809 187161 DEBUG nova.compute.manager [req-d2b122be-ce4f-4025-8e34-107f90d405b8 req-4d54c9f8-2bc3-43a5-888d-421522bb831e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-changed-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:01:04 compute-1 nova_compute[187157]: 2025-12-03 00:01:04.809 187161 DEBUG nova.compute.manager [req-d2b122be-ce4f-4025-8e34-107f90d405b8 req-4d54c9f8-2bc3-43a5-888d-421522bb831e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Refreshing instance network info cache due to event network-changed-4b2c586f-1a7f-4c5d-a6a1-90abac987f19. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:01:04 compute-1 nova_compute[187157]: 2025-12-03 00:01:04.809 187161 DEBUG oslo_concurrency.lockutils [req-d2b122be-ce4f-4025-8e34-107f90d405b8 req-4d54c9f8-2bc3-43a5-888d-421522bb831e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-00869cbc-c7e6-47b4-8d21-c0ac64fe6381" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:01:04 compute-1 nova_compute[187157]: 2025-12-03 00:01:04.809 187161 DEBUG oslo_concurrency.lockutils [req-d2b122be-ce4f-4025-8e34-107f90d405b8 req-4d54c9f8-2bc3-43a5-888d-421522bb831e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-00869cbc-c7e6-47b4-8d21-c0ac64fe6381" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:01:04 compute-1 nova_compute[187157]: 2025-12-03 00:01:04.809 187161 DEBUG nova.network.neutron [req-d2b122be-ce4f-4025-8e34-107f90d405b8 req-4d54c9f8-2bc3-43a5-888d-421522bb831e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Refreshing network info cache for port 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:01:04 compute-1 nova_compute[187157]: 2025-12-03 00:01:04.810 187161 DEBUG nova.compute.manager [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:01:05 compute-1 nova_compute[187157]: 2025-12-03 00:01:05.317 187161 WARNING neutronclient.v2_0.client [req-d2b122be-ce4f-4025-8e34-107f90d405b8 req-4d54c9f8-2bc3-43a5-888d-421522bb831e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:05 compute-1 nova_compute[187157]: 2025-12-03 00:01:05.321 187161 DEBUG nova.compute.manager [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3pj4_prd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='00869cbc-c7e6-47b4-8d21-c0ac64fe6381',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(1587aff6-582d-4e00-88aa-d179269eff0f),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Dec 03 00:01:05 compute-1 podman[197537]: time="2025-12-03T00:01:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:01:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:01:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:01:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:01:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3081 "" "Go-http-client/1.1"
Dec 03 00:01:05 compute-1 nova_compute[187157]: 2025-12-03 00:01:05.847 187161 DEBUG nova.objects.instance [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 00869cbc-c7e6-47b4-8d21-c0ac64fe6381 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:01:05 compute-1 nova_compute[187157]: 2025-12-03 00:01:05.848 187161 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Dec 03 00:01:05 compute-1 nova_compute[187157]: 2025-12-03 00:01:05.850 187161 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:01:05 compute-1 nova_compute[187157]: 2025-12-03 00:01:05.850 187161 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:01:06 compute-1 podman[212485]: 2025-12-03 00:01:06.294056453 +0000 UTC m=+0.126992509 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:01:06 compute-1 nova_compute[187157]: 2025-12-03 00:01:06.352 187161 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:01:06 compute-1 nova_compute[187157]: 2025-12-03 00:01:06.353 187161 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:01:06 compute-1 nova_compute[187157]: 2025-12-03 00:01:06.367 187161 DEBUG nova.virt.libvirt.vif [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-03T00:00:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-361127533',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-361127533',id=11,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:00:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='916fb9304c874baa83b72f5956839b66',ramdisk_id='',reservation_id='r-yzg3uqar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-436376556',owner_user_name='tempest-TestExecuteBasicStrategy-436376556-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:00:18Z,user_data=None,user_id='f68e1c374dfc43b8a8431b13bafb13c8',uuid=00869cbc-c7e6-47b4-8d21-c0ac64fe6381,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:01:06 compute-1 nova_compute[187157]: 2025-12-03 00:01:06.367 187161 DEBUG nova.network.os_vif_util [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:01:06 compute-1 nova_compute[187157]: 2025-12-03 00:01:06.368 187161 DEBUG nova.network.os_vif_util [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:55:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2c586f-1a7f-4c5d-a6a1-90abac987f19,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2c586f-1a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:01:06 compute-1 nova_compute[187157]: 2025-12-03 00:01:06.369 187161 DEBUG nova.virt.libvirt.migration [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Updating guest XML with vif config: <interface type="ethernet">
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <mac address="fa:16:3e:f4:55:7f"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <model type="virtio"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <mtu size="1442"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <target dev="tap4b2c586f-1a"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]: </interface>
Dec 03 00:01:06 compute-1 nova_compute[187157]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Dec 03 00:01:06 compute-1 nova_compute[187157]: 2025-12-03 00:01:06.369 187161 DEBUG nova.virt.libvirt.migration [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <name>instance-0000000b</name>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <uuid>00869cbc-c7e6-47b4-8d21-c0ac64fe6381</uuid>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <metadata>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteBasicStrategy-server-361127533</nova:name>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-03 00:00:12</nova:creationTime>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 03 00:01:06 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:properties>
Dec 03 00:01:06 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         </nova:properties>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       </nova:image>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:owner>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:user uuid="f68e1c374dfc43b8a8431b13bafb13c8">tempest-TestExecuteBasicStrategy-436376556-project-admin</nova:user>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:project uuid="916fb9304c874baa83b72f5956839b66">tempest-TestExecuteBasicStrategy-436376556</nova:project>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       </nova:owner>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:ports>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:port uuid="4b2c586f-1a7f-4c5d-a6a1-90abac987f19">
Dec 03 00:01:06 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         </nova:port>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       </nova:ports>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </nova:instance>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </metadata>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <memory unit="KiB">131072</memory>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <vcpu placement="static">1</vcpu>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <resource>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <partition>/machine</partition>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </resource>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <system>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="serial">00869cbc-c7e6-47b4-8d21-c0ac64fe6381</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="uuid">00869cbc-c7e6-47b4-8d21-c0ac64fe6381</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </system>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </sysinfo>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <os>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </os>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <features>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <acpi/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <apic/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <vmcoreinfo state="on"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </features>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <model fallback="allow">Nehalem</model>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </cpu>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </clock>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <on_reboot>restart</on_reboot>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <on_crash>destroy</on_crash>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <devices>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk.config"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <readonly/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="1" port="0x10"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="2" port="0x11"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="3" port="0x12"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="4" port="0x13"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="5" port="0x14"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="6" port="0x15"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="7" port="0x16"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="8" port="0x17"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="9" port="0x18"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="10" port="0x19"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="11" port="0x1a"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="12" port="0x1b"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="13" port="0x1c"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="14" port="0x1d"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="15" port="0x1e"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="16" port="0x1f"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="17" port="0x20"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="18" port="0x21"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="19" port="0x22"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="20" port="0x23"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="21" port="0x24"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="22" port="0x25"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="23" port="0x26"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="24" port="0x27"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="25" port="0x28"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-pci-bridge"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="sata" index="0">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:f4:55:7f"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target dev="tap4b2c586f-1a"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </interface>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/console.log" append="off"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target type="isa-serial" port="0">
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <model name="isa-serial"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       </target>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </serial>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <console type="pty">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/console.log" append="off"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target type="serial" port="0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </console>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </input>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <input type="mouse" bus="ps2"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <listen type="address" address="::"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </graphics>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <video>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </video>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </memballoon>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </rng>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </devices>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]: </domain>
Dec 03 00:01:06 compute-1 nova_compute[187157]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Dec 03 00:01:06 compute-1 nova_compute[187157]: 2025-12-03 00:01:06.369 187161 DEBUG nova.virt.libvirt.migration [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <name>instance-0000000b</name>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <uuid>00869cbc-c7e6-47b4-8d21-c0ac64fe6381</uuid>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <metadata>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteBasicStrategy-server-361127533</nova:name>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-03 00:00:12</nova:creationTime>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 03 00:01:06 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:properties>
Dec 03 00:01:06 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         </nova:properties>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       </nova:image>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:owner>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:user uuid="f68e1c374dfc43b8a8431b13bafb13c8">tempest-TestExecuteBasicStrategy-436376556-project-admin</nova:user>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:project uuid="916fb9304c874baa83b72f5956839b66">tempest-TestExecuteBasicStrategy-436376556</nova:project>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       </nova:owner>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:ports>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:port uuid="4b2c586f-1a7f-4c5d-a6a1-90abac987f19">
Dec 03 00:01:06 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         </nova:port>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       </nova:ports>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </nova:instance>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </metadata>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <memory unit="KiB">131072</memory>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <vcpu placement="static">1</vcpu>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <resource>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <partition>/machine</partition>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </resource>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <system>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="serial">00869cbc-c7e6-47b4-8d21-c0ac64fe6381</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="uuid">00869cbc-c7e6-47b4-8d21-c0ac64fe6381</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </system>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </sysinfo>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <os>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </os>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <features>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <acpi/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <apic/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <vmcoreinfo state="on"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </features>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <model fallback="allow">Nehalem</model>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </cpu>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </clock>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <on_reboot>restart</on_reboot>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <on_crash>destroy</on_crash>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <devices>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk.config"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <readonly/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="1" port="0x10"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="2" port="0x11"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="3" port="0x12"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="4" port="0x13"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="5" port="0x14"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="6" port="0x15"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="7" port="0x16"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="8" port="0x17"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="9" port="0x18"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="10" port="0x19"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="11" port="0x1a"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="12" port="0x1b"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="13" port="0x1c"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="14" port="0x1d"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="15" port="0x1e"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="16" port="0x1f"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="17" port="0x20"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="18" port="0x21"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="19" port="0x22"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="20" port="0x23"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="21" port="0x24"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="22" port="0x25"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="23" port="0x26"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="24" port="0x27"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="25" port="0x28"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-pci-bridge"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="sata" index="0">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:f4:55:7f"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target dev="tap4b2c586f-1a"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </interface>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/console.log" append="off"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target type="isa-serial" port="0">
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <model name="isa-serial"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       </target>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </serial>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <console type="pty">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/console.log" append="off"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target type="serial" port="0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </console>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </input>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <input type="mouse" bus="ps2"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <listen type="address" address="::"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </graphics>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <video>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </video>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </memballoon>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </rng>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </devices>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]: </domain>
Dec 03 00:01:06 compute-1 nova_compute[187157]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Dec 03 00:01:06 compute-1 nova_compute[187157]: 2025-12-03 00:01:06.370 187161 DEBUG nova.virt.libvirt.migration [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _update_pci_xml output xml=<domain type="kvm">
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <name>instance-0000000b</name>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <uuid>00869cbc-c7e6-47b4-8d21-c0ac64fe6381</uuid>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <metadata>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteBasicStrategy-server-361127533</nova:name>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-03 00:00:12</nova:creationTime>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 03 00:01:06 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:properties>
Dec 03 00:01:06 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         </nova:properties>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       </nova:image>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:owner>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:user uuid="f68e1c374dfc43b8a8431b13bafb13c8">tempest-TestExecuteBasicStrategy-436376556-project-admin</nova:user>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:project uuid="916fb9304c874baa83b72f5956839b66">tempest-TestExecuteBasicStrategy-436376556</nova:project>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       </nova:owner>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <nova:ports>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <nova:port uuid="4b2c586f-1a7f-4c5d-a6a1-90abac987f19">
Dec 03 00:01:06 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:         </nova:port>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       </nova:ports>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </nova:instance>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </metadata>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <memory unit="KiB">131072</memory>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <vcpu placement="static">1</vcpu>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <resource>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <partition>/machine</partition>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </resource>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <system>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="serial">00869cbc-c7e6-47b4-8d21-c0ac64fe6381</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="uuid">00869cbc-c7e6-47b4-8d21-c0ac64fe6381</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </system>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </sysinfo>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <os>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </os>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <features>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <acpi/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <apic/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <vmcoreinfo state="on"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </features>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <model fallback="allow">Nehalem</model>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </cpu>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </clock>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <on_reboot>restart</on_reboot>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <on_crash>destroy</on_crash>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <devices>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk.config"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <readonly/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="1" port="0x10"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="2" port="0x11"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="3" port="0x12"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="4" port="0x13"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="5" port="0x14"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="6" port="0x15"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="7" port="0x16"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="8" port="0x17"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="9" port="0x18"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="10" port="0x19"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="11" port="0x1a"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="12" port="0x1b"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="13" port="0x1c"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="14" port="0x1d"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="15" port="0x1e"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="16" port="0x1f"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="17" port="0x20"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="18" port="0x21"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="19" port="0x22"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="20" port="0x23"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="21" port="0x24"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="22" port="0x25"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="23" port="0x26"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="24" port="0x27"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-root-port"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target chassis="25" port="0x28"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model name="pcie-pci-bridge"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <controller type="sata" index="0">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </controller>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <interface type="ethernet"><mac address="fa:16:3e:f4:55:7f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4b2c586f-1a"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </interface><serial type="pty">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/console.log" append="off"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target type="isa-serial" port="0">
Dec 03 00:01:06 compute-1 nova_compute[187157]:         <model name="isa-serial"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       </target>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </serial>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <console type="pty">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/console.log" append="off"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <target type="serial" port="0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </console>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </input>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <input type="mouse" bus="ps2"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <listen type="address" address="::"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </graphics>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <video>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </video>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </memballoon>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:01:06 compute-1 nova_compute[187157]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]:     </rng>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   </devices>
Dec 03 00:01:06 compute-1 nova_compute[187157]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:01:06 compute-1 nova_compute[187157]: </domain>
Dec 03 00:01:06 compute-1 nova_compute[187157]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Dec 03 00:01:06 compute-1 nova_compute[187157]: 2025-12-03 00:01:06.370 187161 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Dec 03 00:01:06 compute-1 nova_compute[187157]: 2025-12-03 00:01:06.854 187161 DEBUG nova.virt.libvirt.migration [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:01:06 compute-1 nova_compute[187157]: 2025-12-03 00:01:06.855 187161 INFO nova.virt.libvirt.migration [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 03 00:01:07 compute-1 nova_compute[187157]: 2025-12-03 00:01:07.380 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:07 compute-1 nova_compute[187157]: 2025-12-03 00:01:07.507 187161 WARNING neutronclient.v2_0.client [req-d2b122be-ce4f-4025-8e34-107f90d405b8 req-4d54c9f8-2bc3-43a5-888d-421522bb831e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:08 compute-1 nova_compute[187157]: 2025-12-03 00:01:08.071 187161 DEBUG nova.network.neutron [req-d2b122be-ce4f-4025-8e34-107f90d405b8 req-4d54c9f8-2bc3-43a5-888d-421522bb831e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Updated VIF entry in instance network info cache for port 4b2c586f-1a7f-4c5d-a6a1-90abac987f19. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 03 00:01:08 compute-1 nova_compute[187157]: 2025-12-03 00:01:08.071 187161 DEBUG nova.network.neutron [req-d2b122be-ce4f-4025-8e34-107f90d405b8 req-4d54c9f8-2bc3-43a5-888d-421522bb831e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Updating instance_info_cache with network_info: [{"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:01:08 compute-1 nova_compute[187157]: 2025-12-03 00:01:08.103 187161 INFO nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 03 00:01:08 compute-1 nova_compute[187157]: 2025-12-03 00:01:08.577 187161 DEBUG oslo_concurrency.lockutils [req-d2b122be-ce4f-4025-8e34-107f90d405b8 req-4d54c9f8-2bc3-43a5-888d-421522bb831e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-00869cbc-c7e6-47b4-8d21-c0ac64fe6381" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:01:08 compute-1 nova_compute[187157]: 2025-12-03 00:01:08.607 187161 DEBUG nova.virt.libvirt.migration [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:01:08 compute-1 nova_compute[187157]: 2025-12-03 00:01:08.608 187161 DEBUG nova.virt.libvirt.migration [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Dec 03 00:01:08 compute-1 nova_compute[187157]: 2025-12-03 00:01:08.608 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:08 compute-1 kernel: tap4b2c586f-1a (unregistering): left promiscuous mode
Dec 03 00:01:08 compute-1 NetworkManager[55553]: <info>  [1764720068.9117] device (tap4b2c586f-1a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:01:08 compute-1 ovn_controller[95464]: 2025-12-03T00:01:08Z|00110|binding|INFO|Releasing lport 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 from this chassis (sb_readonly=0)
Dec 03 00:01:08 compute-1 nova_compute[187157]: 2025-12-03 00:01:08.918 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:08 compute-1 ovn_controller[95464]: 2025-12-03T00:01:08Z|00111|binding|INFO|Setting lport 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 down in Southbound
Dec 03 00:01:08 compute-1 ovn_controller[95464]: 2025-12-03T00:01:08Z|00112|binding|INFO|Removing iface tap4b2c586f-1a ovn-installed in OVS
Dec 03 00:01:08 compute-1 nova_compute[187157]: 2025-12-03 00:01:08.921 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:08 compute-1 nova_compute[187157]: 2025-12-03 00:01:08.943 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:08 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Dec 03 00:01:08 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000b.scope: Consumed 15.070s CPU time.
Dec 03 00:01:08 compute-1 systemd-machined[153454]: Machine qemu-8-instance-0000000b terminated.
Dec 03 00:01:08 compute-1 podman[212525]: 2025-12-03 00:01:08.993581546 +0000 UTC m=+0.057628674 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 03 00:01:09 compute-1 ovn_controller[95464]: 2025-12-03T00:01:09Z|00113|binding|INFO|Releasing lport 1e62127c-f508-4e9e-bb5e-b8835c45c013 from this chassis (sb_readonly=0)
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.028 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:55:7f 10.100.0.14'], port_security=['fa:16:3e:f4:55:7f 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '83290d9e-bd8f-4c21-b54d-356f7c3da39f'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '00869cbc-c7e6-47b4-8d21-c0ac64fe6381', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c53a3e7-267c-42d7-8662-f773adcc4604', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '916fb9304c874baa83b72f5956839b66', 'neutron:revision_number': '10', 'neutron:security_group_ids': '1093c49e-a0ca-44ab-a8bd-3c19ec9553c0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e22f98a-28c1-406a-8582-57ed07fee88b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=4b2c586f-1a7f-4c5d-a6a1-90abac987f19) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.029 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 in datapath 1c53a3e7-267c-42d7-8662-f773adcc4604 unbound from our chassis
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.030 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1c53a3e7-267c-42d7-8662-f773adcc4604, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.031 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[120c63eb-9348-4741-b169-f80554ffa850]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.031 104348 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604 namespace which is not needed anymore
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.074 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.110 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.115 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:09 compute-1 podman[212567]: 2025-12-03 00:01:09.13747696 +0000 UTC m=+0.029350956 container kill 771ef0682407d081d12a222dd6e47e0e563827a6bd1ef01a80d49708cd01eb6d (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:01:09 compute-1 neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604[212246]: [NOTICE]   (212250) : haproxy version is 3.0.5-8e879a5
Dec 03 00:01:09 compute-1 neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604[212246]: [NOTICE]   (212250) : path to executable is /usr/sbin/haproxy
Dec 03 00:01:09 compute-1 neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604[212246]: [WARNING]  (212250) : Exiting Master process...
Dec 03 00:01:09 compute-1 neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604[212246]: [ALERT]    (212250) : Current worker (212252) exited with code 143 (Terminated)
Dec 03 00:01:09 compute-1 neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604[212246]: [WARNING]  (212250) : All workers exited. Exiting... (0)
Dec 03 00:01:09 compute-1 systemd[1]: libpod-771ef0682407d081d12a222dd6e47e0e563827a6bd1ef01a80d49708cd01eb6d.scope: Deactivated successfully.
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.154 187161 DEBUG nova.virt.libvirt.guest [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.155 187161 INFO nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Migration operation has completed
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.155 187161 INFO nova.compute.manager [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] _post_live_migration() is started..
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.158 187161 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.158 187161 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.159 187161 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.167 187161 WARNING neutronclient.v2_0.client [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.168 187161 WARNING neutronclient.v2_0.client [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:09 compute-1 podman[212596]: 2025-12-03 00:01:09.178899083 +0000 UTC m=+0.021902946 container died 771ef0682407d081d12a222dd6e47e0e563827a6bd1ef01a80d49708cd01eb6d (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Dec 03 00:01:09 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-771ef0682407d081d12a222dd6e47e0e563827a6bd1ef01a80d49708cd01eb6d-userdata-shm.mount: Deactivated successfully.
Dec 03 00:01:09 compute-1 systemd[1]: var-lib-containers-storage-overlay-859c80a6014e96c4c7bb15f58e785dd90cbf40413a1f72173efea561dfc964de-merged.mount: Deactivated successfully.
Dec 03 00:01:09 compute-1 podman[212596]: 2025-12-03 00:01:09.346871545 +0000 UTC m=+0.189875388 container cleanup 771ef0682407d081d12a222dd6e47e0e563827a6bd1ef01a80d49708cd01eb6d (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Dec 03 00:01:09 compute-1 systemd[1]: libpod-conmon-771ef0682407d081d12a222dd6e47e0e563827a6bd1ef01a80d49708cd01eb6d.scope: Deactivated successfully.
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.527 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:09 compute-1 podman[212599]: 2025-12-03 00:01:09.605733208 +0000 UTC m=+0.440884373 container remove 771ef0682407d081d12a222dd6e47e0e563827a6bd1ef01a80d49708cd01eb6d (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.607 187161 DEBUG nova.compute.manager [req-1522e3c0-0e89-4483-900f-55079f308242 req-98576e4b-d2e9-4cae-adde-5a1a3394ae50 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.607 187161 DEBUG oslo_concurrency.lockutils [req-1522e3c0-0e89-4483-900f-55079f308242 req-98576e4b-d2e9-4cae-adde-5a1a3394ae50 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.607 187161 DEBUG oslo_concurrency.lockutils [req-1522e3c0-0e89-4483-900f-55079f308242 req-98576e4b-d2e9-4cae-adde-5a1a3394ae50 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.607 187161 DEBUG oslo_concurrency.lockutils [req-1522e3c0-0e89-4483-900f-55079f308242 req-98576e4b-d2e9-4cae-adde-5a1a3394ae50 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.608 187161 DEBUG nova.compute.manager [req-1522e3c0-0e89-4483-900f-55079f308242 req-98576e4b-d2e9-4cae-adde-5a1a3394ae50 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] No waiting events found dispatching network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.608 187161 DEBUG nova.compute.manager [req-1522e3c0-0e89-4483-900f-55079f308242 req-98576e4b-d2e9-4cae-adde-5a1a3394ae50 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.610 187161 DEBUG nova.compute.manager [req-f077d2e4-1865-4603-aee0-ccfd730be5f6 req-f3230e9b-b081-4899-bb9b-91224acbd8fa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.610 187161 DEBUG oslo_concurrency.lockutils [req-f077d2e4-1865-4603-aee0-ccfd730be5f6 req-f3230e9b-b081-4899-bb9b-91224acbd8fa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.610 187161 DEBUG oslo_concurrency.lockutils [req-f077d2e4-1865-4603-aee0-ccfd730be5f6 req-f3230e9b-b081-4899-bb9b-91224acbd8fa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.610 187161 DEBUG oslo_concurrency.lockutils [req-f077d2e4-1865-4603-aee0-ccfd730be5f6 req-f3230e9b-b081-4899-bb9b-91224acbd8fa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.610 187161 DEBUG nova.compute.manager [req-f077d2e4-1865-4603-aee0-ccfd730be5f6 req-f3230e9b-b081-4899-bb9b-91224acbd8fa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] No waiting events found dispatching network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.611 187161 DEBUG nova.compute.manager [req-f077d2e4-1865-4603-aee0-ccfd730be5f6 req-f3230e9b-b081-4899-bb9b-91224acbd8fa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.611 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[3f8fd1c6-c1f8-4f49-9e52-e291ed85e8ba]: (4, ("Wed Dec  3 12:01:09 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604 (771ef0682407d081d12a222dd6e47e0e563827a6bd1ef01a80d49708cd01eb6d)\n771ef0682407d081d12a222dd6e47e0e563827a6bd1ef01a80d49708cd01eb6d\nWed Dec  3 12:01:09 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604 (771ef0682407d081d12a222dd6e47e0e563827a6bd1ef01a80d49708cd01eb6d)\n771ef0682407d081d12a222dd6e47e0e563827a6bd1ef01a80d49708cd01eb6d\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.612 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[fb5ad46b-f923-4e54-9ef0-73f0c856457c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.612 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.613 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[50177252-dc45-47d8-bedb-c08c51f28a04]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.613 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c53a3e7-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.648 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:09 compute-1 kernel: tap1c53a3e7-20: left promiscuous mode
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.663 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.665 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[defe4829-d663-429f-ae63-42cf8050d565]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.675 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[80e4a67e-b7be-45a3-b5f5-21d352d33e7b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.675 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ca284b3b-e762-40bf-8919-37289b089646]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.688 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[1085f3d1-adb6-4fdb-9f45-9dc28851d690]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403637, 'reachable_time': 22137, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212632, 'error': None, 'target': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:09 compute-1 systemd[1]: run-netns-ovnmeta\x2d1c53a3e7\x2d267c\x2d42d7\x2d8662\x2df773adcc4604.mount: Deactivated successfully.
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.692 104464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:01:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:09.692 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[415ce385-4a26-4ede-93c3-df2a7dde309b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.819 187161 DEBUG nova.network.neutron [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Activated binding for port 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.820 187161 DEBUG nova.compute.manager [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.820 187161 DEBUG nova.virt.libvirt.vif [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-03T00:00:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-361127533',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-361127533',id=11,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:00:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='916fb9304c874baa83b72f5956839b66',ramdisk_id='',reservation_id='r-yzg3uqar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-436376556',owner_user_name='tempest-TestExecuteBasicStrategy-436376556-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:00:47Z,user_data=None,user_id='f68e1c374dfc43b8a8431b13bafb13c8',uuid=00869cbc-c7e6-47b4-8d21-c0ac64fe6381,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.821 187161 DEBUG nova.network.os_vif_util [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.821 187161 DEBUG nova.network.os_vif_util [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:55:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2c586f-1a7f-4c5d-a6a1-90abac987f19,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2c586f-1a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.822 187161 DEBUG os_vif [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:55:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2c586f-1a7f-4c5d-a6a1-90abac987f19,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2c586f-1a') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.824 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.824 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b2c586f-1a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.825 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.827 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.827 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.827 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=ab5a9e69-62dd-46bc-a890-1cd43b16413b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.828 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.829 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.831 187161 INFO os_vif [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:55:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2c586f-1a7f-4c5d-a6a1-90abac987f19,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2c586f-1a')
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.831 187161 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.832 187161 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.832 187161 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.832 187161 DEBUG nova.compute.manager [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.833 187161 INFO nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Deleting instance files /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381_del
Dec 03 00:01:09 compute-1 nova_compute[187157]: 2025-12-03 00:01:09.833 187161 INFO nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Deletion of /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381_del complete
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.691 187161 DEBUG nova.compute.manager [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-plugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.691 187161 DEBUG oslo_concurrency.lockutils [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.691 187161 DEBUG oslo_concurrency.lockutils [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.691 187161 DEBUG oslo_concurrency.lockutils [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.692 187161 DEBUG nova.compute.manager [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] No waiting events found dispatching network-vif-plugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.692 187161 WARNING nova.compute.manager [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received unexpected event network-vif-plugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 for instance with vm_state active and task_state migrating.
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.692 187161 DEBUG nova.compute.manager [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.692 187161 DEBUG oslo_concurrency.lockutils [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.692 187161 DEBUG oslo_concurrency.lockutils [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.692 187161 DEBUG oslo_concurrency.lockutils [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.692 187161 DEBUG nova.compute.manager [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] No waiting events found dispatching network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.693 187161 DEBUG nova.compute.manager [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.693 187161 DEBUG nova.compute.manager [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-plugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.693 187161 DEBUG oslo_concurrency.lockutils [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.693 187161 DEBUG oslo_concurrency.lockutils [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.693 187161 DEBUG oslo_concurrency.lockutils [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.693 187161 DEBUG nova.compute.manager [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] No waiting events found dispatching network-vif-plugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:01:11 compute-1 nova_compute[187157]: 2025-12-03 00:01:11.693 187161 WARNING nova.compute.manager [req-be552bbc-d913-4612-adaa-4337eb51c5ba req-ec118c45-7614-42cb-b655-dbb90dd9ae40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received unexpected event network-vif-plugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 for instance with vm_state active and task_state migrating.
Dec 03 00:01:12 compute-1 nova_compute[187157]: 2025-12-03 00:01:12.382 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:14 compute-1 nova_compute[187157]: 2025-12-03 00:01:14.828 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:17 compute-1 nova_compute[187157]: 2025-12-03 00:01:17.384 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:18 compute-1 nova_compute[187157]: 2025-12-03 00:01:18.400 187161 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:18 compute-1 nova_compute[187157]: 2025-12-03 00:01:18.401 187161 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:18 compute-1 nova_compute[187157]: 2025-12-03 00:01:18.401 187161 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:18 compute-1 nova_compute[187157]: 2025-12-03 00:01:18.919 187161 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:18 compute-1 nova_compute[187157]: 2025-12-03 00:01:18.920 187161 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:18 compute-1 nova_compute[187157]: 2025-12-03 00:01:18.920 187161 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:18 compute-1 nova_compute[187157]: 2025-12-03 00:01:18.920 187161 DEBUG nova.compute.resource_tracker [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:01:19 compute-1 nova_compute[187157]: 2025-12-03 00:01:19.046 187161 WARNING nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:01:19 compute-1 nova_compute[187157]: 2025-12-03 00:01:19.048 187161 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:01:19 compute-1 nova_compute[187157]: 2025-12-03 00:01:19.070 187161 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:01:19 compute-1 nova_compute[187157]: 2025-12-03 00:01:19.071 187161 DEBUG nova.compute.resource_tracker [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5845MB free_disk=73.1682243347168GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:01:19 compute-1 nova_compute[187157]: 2025-12-03 00:01:19.071 187161 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:19 compute-1 nova_compute[187157]: 2025-12-03 00:01:19.071 187161 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:19 compute-1 openstack_network_exporter[199685]: ERROR   00:01:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:01:19 compute-1 openstack_network_exporter[199685]: ERROR   00:01:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:01:19 compute-1 openstack_network_exporter[199685]: ERROR   00:01:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:01:19 compute-1 openstack_network_exporter[199685]: ERROR   00:01:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:01:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:01:19 compute-1 openstack_network_exporter[199685]: ERROR   00:01:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:01:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:01:19 compute-1 nova_compute[187157]: 2025-12-03 00:01:19.829 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:20 compute-1 nova_compute[187157]: 2025-12-03 00:01:20.087 187161 DEBUG nova.compute.resource_tracker [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration for instance 00869cbc-c7e6-47b4-8d21-c0ac64fe6381 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:01:20 compute-1 nova_compute[187157]: 2025-12-03 00:01:20.597 187161 DEBUG nova.compute.resource_tracker [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:01:20 compute-1 nova_compute[187157]: 2025-12-03 00:01:20.634 187161 DEBUG nova.compute.resource_tracker [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration 1587aff6-582d-4e00-88aa-d179269eff0f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:01:20 compute-1 nova_compute[187157]: 2025-12-03 00:01:20.634 187161 DEBUG nova.compute.resource_tracker [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:01:20 compute-1 nova_compute[187157]: 2025-12-03 00:01:20.635 187161 DEBUG nova.compute.resource_tracker [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:01:19 up  1:08,  0 user,  load average: 0.31, 0.35, 0.38\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:01:20 compute-1 nova_compute[187157]: 2025-12-03 00:01:20.678 187161 DEBUG nova.compute.provider_tree [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:01:21 compute-1 nova_compute[187157]: 2025-12-03 00:01:21.185 187161 DEBUG nova.scheduler.client.report [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:01:21 compute-1 podman[212635]: 2025-12-03 00:01:21.2191546 +0000 UTC m=+0.057125192 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Dec 03 00:01:21 compute-1 nova_compute[187157]: 2025-12-03 00:01:21.698 187161 DEBUG nova.compute.resource_tracker [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:01:21 compute-1 nova_compute[187157]: 2025-12-03 00:01:21.698 187161 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.627s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:21 compute-1 nova_compute[187157]: 2025-12-03 00:01:21.716 187161 INFO nova.compute.manager [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Dec 03 00:01:22 compute-1 nova_compute[187157]: 2025-12-03 00:01:22.385 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:22 compute-1 nova_compute[187157]: 2025-12-03 00:01:22.785 187161 INFO nova.scheduler.client.report [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration 1587aff6-582d-4e00-88aa-d179269eff0f
Dec 03 00:01:22 compute-1 nova_compute[187157]: 2025-12-03 00:01:22.786 187161 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Dec 03 00:01:23 compute-1 sshd-session[212656]: Invalid user sol from 193.32.162.146 port 51448
Dec 03 00:01:23 compute-1 podman[212658]: 2025-12-03 00:01:23.495156287 +0000 UTC m=+0.071261901 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 03 00:01:23 compute-1 sshd-session[212656]: Connection closed by invalid user sol 193.32.162.146 port 51448 [preauth]
Dec 03 00:01:24 compute-1 nova_compute[187157]: 2025-12-03 00:01:24.831 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:27 compute-1 nova_compute[187157]: 2025-12-03 00:01:27.390 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:28 compute-1 nova_compute[187157]: 2025-12-03 00:01:28.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:01:29 compute-1 nova_compute[187157]: 2025-12-03 00:01:29.834 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:32 compute-1 podman[212679]: 2025-12-03 00:01:32.223316416 +0000 UTC m=+0.055186635 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:01:32 compute-1 nova_compute[187157]: 2025-12-03 00:01:32.391 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:34 compute-1 nova_compute[187157]: 2025-12-03 00:01:34.424 187161 DEBUG nova.compute.manager [None req-2559e6ae-45cd-4b17-bd1d-6cbba93c6a43 7ede684cab6e46758f9d1100711cfe79 22106c97f2524355a0bbadb98eaf5c22 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:632
Dec 03 00:01:34 compute-1 nova_compute[187157]: 2025-12-03 00:01:34.489 187161 DEBUG nova.compute.provider_tree [None req-2559e6ae-45cd-4b17-bd1d-6cbba93c6a43 7ede684cab6e46758f9d1100711cfe79 22106c97f2524355a0bbadb98eaf5c22 - - default default] Updating resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 generation from 12 to 15 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 03 00:01:34 compute-1 nova_compute[187157]: 2025-12-03 00:01:34.836 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:35 compute-1 podman[197537]: time="2025-12-03T00:01:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:01:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:01:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:01:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:01:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2611 "" "Go-http-client/1.1"
Dec 03 00:01:36 compute-1 nova_compute[187157]: 2025-12-03 00:01:36.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:01:37 compute-1 podman[212702]: 2025-12-03 00:01:37.270786123 +0000 UTC m=+0.102185853 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 03 00:01:37 compute-1 nova_compute[187157]: 2025-12-03 00:01:37.393 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:37 compute-1 nova_compute[187157]: 2025-12-03 00:01:37.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:01:38 compute-1 nova_compute[187157]: 2025-12-03 00:01:38.695 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:01:38 compute-1 nova_compute[187157]: 2025-12-03 00:01:38.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:01:38 compute-1 nova_compute[187157]: 2025-12-03 00:01:38.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:01:38 compute-1 nova_compute[187157]: 2025-12-03 00:01:38.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:01:39 compute-1 podman[212729]: 2025-12-03 00:01:39.204903715 +0000 UTC m=+0.047948221 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 03 00:01:39 compute-1 nova_compute[187157]: 2025-12-03 00:01:39.211 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:39 compute-1 nova_compute[187157]: 2025-12-03 00:01:39.211 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:39 compute-1 nova_compute[187157]: 2025-12-03 00:01:39.211 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:39 compute-1 nova_compute[187157]: 2025-12-03 00:01:39.211 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:01:39 compute-1 nova_compute[187157]: 2025-12-03 00:01:39.334 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:01:39 compute-1 nova_compute[187157]: 2025-12-03 00:01:39.335 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:01:39 compute-1 nova_compute[187157]: 2025-12-03 00:01:39.353 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:01:39 compute-1 nova_compute[187157]: 2025-12-03 00:01:39.353 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5845MB free_disk=73.1682243347168GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:01:39 compute-1 nova_compute[187157]: 2025-12-03 00:01:39.354 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:39 compute-1 nova_compute[187157]: 2025-12-03 00:01:39.354 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:39 compute-1 nova_compute[187157]: 2025-12-03 00:01:39.837 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:40 compute-1 nova_compute[187157]: 2025-12-03 00:01:40.399 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:01:40 compute-1 nova_compute[187157]: 2025-12-03 00:01:40.399 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:01:39 up  1:08,  0 user,  load average: 0.28, 0.34, 0.37\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:01:40 compute-1 nova_compute[187157]: 2025-12-03 00:01:40.417 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:01:40 compute-1 nova_compute[187157]: 2025-12-03 00:01:40.939 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:01:41 compute-1 nova_compute[187157]: 2025-12-03 00:01:41.449 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:01:41 compute-1 nova_compute[187157]: 2025-12-03 00:01:41.452 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:42 compute-1 nova_compute[187157]: 2025-12-03 00:01:42.395 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:44 compute-1 nova_compute[187157]: 2025-12-03 00:01:44.454 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:01:44 compute-1 nova_compute[187157]: 2025-12-03 00:01:44.454 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:01:44 compute-1 nova_compute[187157]: 2025-12-03 00:01:44.839 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:47 compute-1 nova_compute[187157]: 2025-12-03 00:01:47.397 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:48.037 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:78:55 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-745be26c-0cf1-4daa-aa35-3c721fbf4717', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-745be26c-0cf1-4daa-aa35-3c721fbf4717', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0f290453794d4fa8afe33607b761758b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0316d60-1ac4-42e4-998e-71514e598331, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a28c2153-d1ad-49de-a81a-3dcc04970435) old=Port_Binding(mac=['fa:16:3e:f4:78:55'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-745be26c-0cf1-4daa-aa35-3c721fbf4717', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-745be26c-0cf1-4daa-aa35-3c721fbf4717', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0f290453794d4fa8afe33607b761758b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:01:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:48.038 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a28c2153-d1ad-49de-a81a-3dcc04970435 in datapath 745be26c-0cf1-4daa-aa35-3c721fbf4717 updated
Dec 03 00:01:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:48.038 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 745be26c-0cf1-4daa-aa35-3c721fbf4717, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:01:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:48.039 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[4efac37c-db7a-40ca-8329-54ad1c7fcab5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:49 compute-1 openstack_network_exporter[199685]: ERROR   00:01:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:01:49 compute-1 openstack_network_exporter[199685]: ERROR   00:01:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:01:49 compute-1 openstack_network_exporter[199685]: ERROR   00:01:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:01:49 compute-1 openstack_network_exporter[199685]: ERROR   00:01:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:01:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:01:49 compute-1 openstack_network_exporter[199685]: ERROR   00:01:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:01:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:01:49 compute-1 nova_compute[187157]: 2025-12-03 00:01:49.841 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:52 compute-1 podman[212749]: 2025-12-03 00:01:52.245338728 +0000 UTC m=+0.079341366 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:01:52 compute-1 nova_compute[187157]: 2025-12-03 00:01:52.398 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:54 compute-1 podman[212770]: 2025-12-03 00:01:54.248889547 +0000 UTC m=+0.080038142 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS)
Dec 03 00:01:54 compute-1 nova_compute[187157]: 2025-12-03 00:01:54.843 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:55.759 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:75:d4 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-33af606c-dc30-4673-b583-9dcb920ad7fd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33af606c-dc30-4673-b583-9dcb920ad7fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3dee725c2ed74441890102c62cd79f8e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8d40e54-333e-46e7-95bd-2ddce91fa81d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fcfdc77d-119e-456c-8cca-ff658d4ac68e) old=Port_Binding(mac=['fa:16:3e:46:75:d4'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-33af606c-dc30-4673-b583-9dcb920ad7fd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33af606c-dc30-4673-b583-9dcb920ad7fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3dee725c2ed74441890102c62cd79f8e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:01:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:55.760 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fcfdc77d-119e-456c-8cca-ff658d4ac68e in datapath 33af606c-dc30-4673-b583-9dcb920ad7fd updated
Dec 03 00:01:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:55.761 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 33af606c-dc30-4673-b583-9dcb920ad7fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:01:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:01:55.762 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[bba2aa53-63c7-46f6-be79-43e51b363091]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:57 compute-1 nova_compute[187157]: 2025-12-03 00:01:57.402 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:59 compute-1 nova_compute[187157]: 2025-12-03 00:01:59.844 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:02:01.715 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:02:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:02:01.715 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:02:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:02:01.715 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:02:02 compute-1 nova_compute[187157]: 2025-12-03 00:02:02.402 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:03 compute-1 podman[212792]: 2025-12-03 00:02:03.211680679 +0000 UTC m=+0.055608795 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:02:04 compute-1 nova_compute[187157]: 2025-12-03 00:02:04.845 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:05 compute-1 podman[197537]: time="2025-12-03T00:02:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:02:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:02:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:02:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:02:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2608 "" "Go-http-client/1.1"
Dec 03 00:02:07 compute-1 nova_compute[187157]: 2025-12-03 00:02:07.403 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:08 compute-1 podman[212819]: 2025-12-03 00:02:08.227936988 +0000 UTC m=+0.070630467 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 03 00:02:08 compute-1 ovn_controller[95464]: 2025-12-03T00:02:08Z|00114|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 03 00:02:09 compute-1 nova_compute[187157]: 2025-12-03 00:02:09.846 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:10 compute-1 podman[212845]: 2025-12-03 00:02:10.263812523 +0000 UTC m=+0.098759032 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202)
Dec 03 00:02:12 compute-1 nova_compute[187157]: 2025-12-03 00:02:12.405 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:02:14.228 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:02:14 compute-1 nova_compute[187157]: 2025-12-03 00:02:14.229 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:02:14.230 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:02:14 compute-1 nova_compute[187157]: 2025-12-03 00:02:14.848 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:17 compute-1 nova_compute[187157]: 2025-12-03 00:02:17.406 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:19 compute-1 openstack_network_exporter[199685]: ERROR   00:02:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:02:19 compute-1 openstack_network_exporter[199685]: ERROR   00:02:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:02:19 compute-1 openstack_network_exporter[199685]: ERROR   00:02:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:02:19 compute-1 openstack_network_exporter[199685]: ERROR   00:02:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:02:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:02:19 compute-1 openstack_network_exporter[199685]: ERROR   00:02:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:02:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:02:19 compute-1 nova_compute[187157]: 2025-12-03 00:02:19.851 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:02:22.231 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:02:22 compute-1 nova_compute[187157]: 2025-12-03 00:02:22.406 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:23 compute-1 podman[212865]: 2025-12-03 00:02:23.22368294 +0000 UTC m=+0.061380135 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 03 00:02:24 compute-1 nova_compute[187157]: 2025-12-03 00:02:24.852 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:25 compute-1 podman[212887]: 2025-12-03 00:02:25.214126094 +0000 UTC m=+0.061094857 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd)
Dec 03 00:02:26 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:02:26.776 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:bb:c6 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '722828099f1644218029b73eaf67d6b4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e46e490-abb3-4025-b870-a46519cde774, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=add6ea4f-8836-4bed-8f1e-39e943ccf4b5) old=Port_Binding(mac=['fa:16:3e:6d:bb:c6'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '722828099f1644218029b73eaf67d6b4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:02:26 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:02:26.777 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port add6ea4f-8836-4bed-8f1e-39e943ccf4b5 in datapath ed11b71b-745b-4f0c-9f09-37d53d166bcb updated
Dec 03 00:02:26 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:02:26.778 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed11b71b-745b-4f0c-9f09-37d53d166bcb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:02:26 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:02:26.779 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[fb22a1b2-4b64-4560-9f93-486354455e4f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:02:27 compute-1 nova_compute[187157]: 2025-12-03 00:02:27.409 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:27 compute-1 nova_compute[187157]: 2025-12-03 00:02:27.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:27 compute-1 nova_compute[187157]: 2025-12-03 00:02:27.701 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 03 00:02:29 compute-1 nova_compute[187157]: 2025-12-03 00:02:29.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:29 compute-1 nova_compute[187157]: 2025-12-03 00:02:29.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:29 compute-1 nova_compute[187157]: 2025-12-03 00:02:29.854 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:32 compute-1 nova_compute[187157]: 2025-12-03 00:02:32.410 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:34 compute-1 podman[212907]: 2025-12-03 00:02:34.219732425 +0000 UTC m=+0.058905445 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:02:34 compute-1 nova_compute[187157]: 2025-12-03 00:02:34.856 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:35 compute-1 podman[197537]: time="2025-12-03T00:02:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:02:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:02:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:02:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:02:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2616 "" "Go-http-client/1.1"
Dec 03 00:02:37 compute-1 nova_compute[187157]: 2025-12-03 00:02:37.207 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:37 compute-1 nova_compute[187157]: 2025-12-03 00:02:37.411 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:37 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:02:37.507 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:a1:44 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-25ca5bbc-e54a-44c7-ba31-d417797e7df1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25ca5bbc-e54a-44c7-ba31-d417797e7df1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85e2f91a92cf4b5a9d626e8418f17322', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=60db874a-6799-4e4b-b253-d9de0d5108a2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fcadb6ca-198d-432c-87c8-ff78b1242ce1) old=Port_Binding(mac=['fa:16:3e:a3:a1:44'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-25ca5bbc-e54a-44c7-ba31-d417797e7df1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25ca5bbc-e54a-44c7-ba31-d417797e7df1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85e2f91a92cf4b5a9d626e8418f17322', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:02:37 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:02:37.508 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fcadb6ca-198d-432c-87c8-ff78b1242ce1 in datapath 25ca5bbc-e54a-44c7-ba31-d417797e7df1 updated
Dec 03 00:02:37 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:02:37.509 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 25ca5bbc-e54a-44c7-ba31-d417797e7df1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:02:37 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:02:37.509 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[5d79bb45-f42b-4fde-bacc-9e80bbfdf6ca]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:02:38 compute-1 nova_compute[187157]: 2025-12-03 00:02:38.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:39 compute-1 nova_compute[187157]: 2025-12-03 00:02:39.218 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:02:39 compute-1 nova_compute[187157]: 2025-12-03 00:02:39.219 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:02:39 compute-1 nova_compute[187157]: 2025-12-03 00:02:39.219 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:02:39 compute-1 nova_compute[187157]: 2025-12-03 00:02:39.219 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:02:39 compute-1 podman[212932]: 2025-12-03 00:02:39.249447956 +0000 UTC m=+0.098142156 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 03 00:02:39 compute-1 nova_compute[187157]: 2025-12-03 00:02:39.361 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:02:39 compute-1 nova_compute[187157]: 2025-12-03 00:02:39.363 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:02:39 compute-1 nova_compute[187157]: 2025-12-03 00:02:39.385 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:02:39 compute-1 nova_compute[187157]: 2025-12-03 00:02:39.386 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5829MB free_disk=73.16814804077148GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:02:39 compute-1 nova_compute[187157]: 2025-12-03 00:02:39.386 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:02:39 compute-1 nova_compute[187157]: 2025-12-03 00:02:39.387 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:02:39 compute-1 nova_compute[187157]: 2025-12-03 00:02:39.858 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:40 compute-1 nova_compute[187157]: 2025-12-03 00:02:40.444 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:02:40 compute-1 nova_compute[187157]: 2025-12-03 00:02:40.445 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:02:39 up  1:09,  0 user,  load average: 0.16, 0.29, 0.35\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:02:40 compute-1 nova_compute[187157]: 2025-12-03 00:02:40.473 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:02:40 compute-1 nova_compute[187157]: 2025-12-03 00:02:40.982 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:02:41 compute-1 podman[212959]: 2025-12-03 00:02:41.241732624 +0000 UTC m=+0.068085435 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 03 00:02:41 compute-1 nova_compute[187157]: 2025-12-03 00:02:41.494 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:02:41 compute-1 nova_compute[187157]: 2025-12-03 00:02:41.494 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:02:41 compute-1 nova_compute[187157]: 2025-12-03 00:02:41.495 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:41 compute-1 nova_compute[187157]: 2025-12-03 00:02:41.495 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 03 00:02:42 compute-1 nova_compute[187157]: 2025-12-03 00:02:42.002 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 03 00:02:42 compute-1 nova_compute[187157]: 2025-12-03 00:02:42.413 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:42 compute-1 nova_compute[187157]: 2025-12-03 00:02:42.997 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:42 compute-1 nova_compute[187157]: 2025-12-03 00:02:42.998 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:42 compute-1 nova_compute[187157]: 2025-12-03 00:02:42.998 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:42 compute-1 nova_compute[187157]: 2025-12-03 00:02:42.998 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:02:43 compute-1 nova_compute[187157]: 2025-12-03 00:02:43.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:44 compute-1 nova_compute[187157]: 2025-12-03 00:02:44.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:44 compute-1 nova_compute[187157]: 2025-12-03 00:02:44.860 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:47 compute-1 nova_compute[187157]: 2025-12-03 00:02:47.415 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:48 compute-1 nova_compute[187157]: 2025-12-03 00:02:48.697 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:49 compute-1 openstack_network_exporter[199685]: ERROR   00:02:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:02:49 compute-1 openstack_network_exporter[199685]: ERROR   00:02:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:02:49 compute-1 openstack_network_exporter[199685]: ERROR   00:02:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:02:49 compute-1 openstack_network_exporter[199685]: ERROR   00:02:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:02:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:02:49 compute-1 openstack_network_exporter[199685]: ERROR   00:02:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:02:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:02:49 compute-1 nova_compute[187157]: 2025-12-03 00:02:49.862 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:52 compute-1 nova_compute[187157]: 2025-12-03 00:02:52.416 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:54 compute-1 podman[212978]: 2025-12-03 00:02:54.20437547 +0000 UTC m=+0.050909052 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible)
Dec 03 00:02:54 compute-1 nova_compute[187157]: 2025-12-03 00:02:54.864 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:56 compute-1 podman[212999]: 2025-12-03 00:02:56.207907978 +0000 UTC m=+0.054981260 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4)
Dec 03 00:02:57 compute-1 nova_compute[187157]: 2025-12-03 00:02:57.419 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:59 compute-1 nova_compute[187157]: 2025-12-03 00:02:59.866 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:00 compute-1 nova_compute[187157]: 2025-12-03 00:03:00.478 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:01.716 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:01.717 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:01.717 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:01 compute-1 anacron[7493]: Job `cron.monthly' started
Dec 03 00:03:01 compute-1 anacron[7493]: Job `cron.monthly' terminated
Dec 03 00:03:01 compute-1 anacron[7493]: Normal exit (3 jobs run)
Dec 03 00:03:02 compute-1 nova_compute[187157]: 2025-12-03 00:03:02.423 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:04 compute-1 nova_compute[187157]: 2025-12-03 00:03:04.869 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:05 compute-1 podman[213022]: 2025-12-03 00:03:05.216034568 +0000 UTC m=+0.059194592 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:03:05 compute-1 podman[197537]: time="2025-12-03T00:03:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:03:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:03:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:03:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:03:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2610 "" "Go-http-client/1.1"
Dec 03 00:03:07 compute-1 nova_compute[187157]: 2025-12-03 00:03:07.423 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:07 compute-1 nova_compute[187157]: 2025-12-03 00:03:07.979 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "219f599c-28c7-4f88-b738-36849b54aeb4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:07 compute-1 nova_compute[187157]: 2025-12-03 00:03:07.979 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "219f599c-28c7-4f88-b738-36849b54aeb4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:08 compute-1 nova_compute[187157]: 2025-12-03 00:03:08.484 187161 DEBUG nova.compute.manager [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:03:09 compute-1 nova_compute[187157]: 2025-12-03 00:03:09.033 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:09 compute-1 nova_compute[187157]: 2025-12-03 00:03:09.034 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:09 compute-1 nova_compute[187157]: 2025-12-03 00:03:09.041 187161 DEBUG nova.virt.hardware [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:03:09 compute-1 nova_compute[187157]: 2025-12-03 00:03:09.041 187161 INFO nova.compute.claims [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Claim successful on node compute-1.ctlplane.example.com
Dec 03 00:03:09 compute-1 nova_compute[187157]: 2025-12-03 00:03:09.871 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:10 compute-1 nova_compute[187157]: 2025-12-03 00:03:10.101 187161 DEBUG nova.compute.provider_tree [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:03:10 compute-1 podman[213047]: 2025-12-03 00:03:10.254606602 +0000 UTC m=+0.090430281 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 03 00:03:10 compute-1 nova_compute[187157]: 2025-12-03 00:03:10.608 187161 DEBUG nova.scheduler.client.report [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:03:11 compute-1 nova_compute[187157]: 2025-12-03 00:03:11.130 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.095s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:11 compute-1 nova_compute[187157]: 2025-12-03 00:03:11.131 187161 DEBUG nova.compute.manager [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:03:11 compute-1 nova_compute[187157]: 2025-12-03 00:03:11.647 187161 DEBUG nova.compute.manager [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:03:11 compute-1 nova_compute[187157]: 2025-12-03 00:03:11.647 187161 DEBUG nova.network.neutron [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:03:11 compute-1 nova_compute[187157]: 2025-12-03 00:03:11.647 187161 WARNING neutronclient.v2_0.client [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:11 compute-1 nova_compute[187157]: 2025-12-03 00:03:11.648 187161 WARNING neutronclient.v2_0.client [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:12 compute-1 nova_compute[187157]: 2025-12-03 00:03:12.157 187161 INFO nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:03:12 compute-1 podman[213074]: 2025-12-03 00:03:12.202058634 +0000 UTC m=+0.049014747 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 03 00:03:12 compute-1 nova_compute[187157]: 2025-12-03 00:03:12.426 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:12 compute-1 nova_compute[187157]: 2025-12-03 00:03:12.666 187161 DEBUG nova.compute.manager [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.558 187161 DEBUG nova.network.neutron [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Successfully created port: 751b874c-d73f-40e4-8f78-0ec9fe6bd11f _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.682 187161 DEBUG nova.compute.manager [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.683 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.684 187161 INFO nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Creating image(s)
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.684 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "/var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.684 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "/var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.685 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "/var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.686 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.689 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.691 187161 DEBUG oslo_concurrency.processutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.741 187161 DEBUG oslo_concurrency.processutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.742 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.742 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.743 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.745 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.746 187161 DEBUG oslo_concurrency.processutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.803 187161 DEBUG oslo_concurrency.processutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.804 187161 DEBUG oslo_concurrency.processutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.839 187161 DEBUG oslo_concurrency.processutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.840 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.841 187161 DEBUG oslo_concurrency.processutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.893 187161 DEBUG oslo_concurrency.processutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.894 187161 DEBUG nova.virt.disk.api [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Checking if we can resize image /var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.894 187161 DEBUG oslo_concurrency.processutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.953 187161 DEBUG oslo_concurrency.processutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.954 187161 DEBUG nova.virt.disk.api [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Cannot resize image /var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.955 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.955 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Ensure instance console log exists: /var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.955 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.956 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:13 compute-1 nova_compute[187157]: 2025-12-03 00:03:13.956 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:14.516 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:03:14 compute-1 nova_compute[187157]: 2025-12-03 00:03:14.517 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:14.517 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:03:14 compute-1 nova_compute[187157]: 2025-12-03 00:03:14.814 187161 DEBUG nova.network.neutron [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Successfully updated port: 751b874c-d73f-40e4-8f78-0ec9fe6bd11f _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:03:14 compute-1 nova_compute[187157]: 2025-12-03 00:03:14.872 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:14 compute-1 nova_compute[187157]: 2025-12-03 00:03:14.897 187161 DEBUG nova.compute.manager [req-6ce0aca8-a89b-4133-be11-fa0211dcee23 req-bec7afab-0e39-40a0-895d-a2f62cea80bc 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Received event network-changed-751b874c-d73f-40e4-8f78-0ec9fe6bd11f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:03:14 compute-1 nova_compute[187157]: 2025-12-03 00:03:14.897 187161 DEBUG nova.compute.manager [req-6ce0aca8-a89b-4133-be11-fa0211dcee23 req-bec7afab-0e39-40a0-895d-a2f62cea80bc 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Refreshing instance network info cache due to event network-changed-751b874c-d73f-40e4-8f78-0ec9fe6bd11f. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:03:14 compute-1 nova_compute[187157]: 2025-12-03 00:03:14.898 187161 DEBUG oslo_concurrency.lockutils [req-6ce0aca8-a89b-4133-be11-fa0211dcee23 req-bec7afab-0e39-40a0-895d-a2f62cea80bc 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-219f599c-28c7-4f88-b738-36849b54aeb4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:03:14 compute-1 nova_compute[187157]: 2025-12-03 00:03:14.898 187161 DEBUG oslo_concurrency.lockutils [req-6ce0aca8-a89b-4133-be11-fa0211dcee23 req-bec7afab-0e39-40a0-895d-a2f62cea80bc 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-219f599c-28c7-4f88-b738-36849b54aeb4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:03:14 compute-1 nova_compute[187157]: 2025-12-03 00:03:14.898 187161 DEBUG nova.network.neutron [req-6ce0aca8-a89b-4133-be11-fa0211dcee23 req-bec7afab-0e39-40a0-895d-a2f62cea80bc 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Refreshing network info cache for port 751b874c-d73f-40e4-8f78-0ec9fe6bd11f _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:03:15 compute-1 nova_compute[187157]: 2025-12-03 00:03:15.321 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "refresh_cache-219f599c-28c7-4f88-b738-36849b54aeb4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:03:15 compute-1 nova_compute[187157]: 2025-12-03 00:03:15.405 187161 WARNING neutronclient.v2_0.client [req-6ce0aca8-a89b-4133-be11-fa0211dcee23 req-bec7afab-0e39-40a0-895d-a2f62cea80bc 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:15 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:15.520 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:16 compute-1 nova_compute[187157]: 2025-12-03 00:03:16.436 187161 DEBUG nova.network.neutron [req-6ce0aca8-a89b-4133-be11-fa0211dcee23 req-bec7afab-0e39-40a0-895d-a2f62cea80bc 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:03:16 compute-1 nova_compute[187157]: 2025-12-03 00:03:16.738 187161 DEBUG nova.network.neutron [req-6ce0aca8-a89b-4133-be11-fa0211dcee23 req-bec7afab-0e39-40a0-895d-a2f62cea80bc 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:03:17 compute-1 nova_compute[187157]: 2025-12-03 00:03:17.245 187161 DEBUG oslo_concurrency.lockutils [req-6ce0aca8-a89b-4133-be11-fa0211dcee23 req-bec7afab-0e39-40a0-895d-a2f62cea80bc 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-219f599c-28c7-4f88-b738-36849b54aeb4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:03:17 compute-1 nova_compute[187157]: 2025-12-03 00:03:17.246 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquired lock "refresh_cache-219f599c-28c7-4f88-b738-36849b54aeb4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:03:17 compute-1 nova_compute[187157]: 2025-12-03 00:03:17.247 187161 DEBUG nova.network.neutron [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:03:17 compute-1 nova_compute[187157]: 2025-12-03 00:03:17.427 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:17 compute-1 nova_compute[187157]: 2025-12-03 00:03:17.925 187161 DEBUG nova.network.neutron [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.124 187161 WARNING neutronclient.v2_0.client [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.287 187161 DEBUG nova.network.neutron [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Updating instance_info_cache with network_info: [{"id": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "address": "fa:16:3e:2d:54:f4", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap751b874c-d7", "ovs_interfaceid": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.793 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Releasing lock "refresh_cache-219f599c-28c7-4f88-b738-36849b54aeb4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.794 187161 DEBUG nova.compute.manager [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Instance network_info: |[{"id": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "address": "fa:16:3e:2d:54:f4", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap751b874c-d7", "ovs_interfaceid": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.798 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Start _get_guest_xml network_info=[{"id": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "address": "fa:16:3e:2d:54:f4", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap751b874c-d7", "ovs_interfaceid": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.803 187161 WARNING nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.805 187161 DEBUG nova.virt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-739861395', uuid='219f599c-28c7-4f88-b738-36849b54aeb4'), owner=OwnerMeta(userid='ab182b4a69794d1fa103fbd3d503df99', username='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin', projectid='85e2f91a92cf4b5a9d626e8418f17322', projectname='tempest-TestExecuteHostMaintenanceStrategy-1767783627'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "address": "fa:16:3e:2d:54:f4", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap751b874c-d7", "ovs_interfaceid": 
"751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764720198.8057172) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.812 187161 DEBUG nova.virt.libvirt.host [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.813 187161 DEBUG nova.virt.libvirt.host [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.819 187161 DEBUG nova.virt.libvirt.host [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.819 187161 DEBUG nova.virt.libvirt.host [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.821 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.822 187161 DEBUG nova.virt.hardware [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.822 187161 DEBUG nova.virt.hardware [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.823 187161 DEBUG nova.virt.hardware [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.823 187161 DEBUG nova.virt.hardware [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.824 187161 DEBUG nova.virt.hardware [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.824 187161 DEBUG nova.virt.hardware [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.824 187161 DEBUG nova.virt.hardware [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.825 187161 DEBUG nova.virt.hardware [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.825 187161 DEBUG nova.virt.hardware [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.825 187161 DEBUG nova.virt.hardware [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.826 187161 DEBUG nova.virt.hardware [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.830 187161 DEBUG nova.virt.libvirt.vif [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:03:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-739861395',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-739861395',id=13,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-3y1rpwrk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:03:12Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=219f599c-28c7-4f88-b738-36849b54aeb4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "address": "fa:16:3e:2d:54:f4", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap751b874c-d7", "ovs_interfaceid": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.831 187161 DEBUG nova.network.os_vif_util [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converting VIF {"id": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "address": "fa:16:3e:2d:54:f4", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap751b874c-d7", "ovs_interfaceid": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.831 187161 DEBUG nova.network.os_vif_util [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:f4,bridge_name='br-int',has_traffic_filtering=True,id=751b874c-d73f-40e4-8f78-0ec9fe6bd11f,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap751b874c-d7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:03:18 compute-1 nova_compute[187157]: 2025-12-03 00:03:18.832 187161 DEBUG nova.objects.instance [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lazy-loading 'pci_devices' on Instance uuid 219f599c-28c7-4f88-b738-36849b54aeb4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.340 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:03:19 compute-1 nova_compute[187157]:   <uuid>219f599c-28c7-4f88-b738-36849b54aeb4</uuid>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   <name>instance-0000000d</name>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   <memory>131072</memory>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   <vcpu>1</vcpu>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   <metadata>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-739861395</nova:name>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-03 00:03:18</nova:creationTime>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:03:19 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 03 00:03:19 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 03 00:03:19 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 03 00:03:19 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:03:19 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:03:19 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 03 00:03:19 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:03:19 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:03:19 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:03:19 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:03:19 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:03:19 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 03 00:03:19 compute-1 nova_compute[187157]:         <nova:properties>
Dec 03 00:03:19 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:03:19 compute-1 nova_compute[187157]:         </nova:properties>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       </nova:image>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <nova:owner>
Dec 03 00:03:19 compute-1 nova_compute[187157]:         <nova:user uuid="ab182b4a69794d1fa103fbd3d503df99">tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin</nova:user>
Dec 03 00:03:19 compute-1 nova_compute[187157]:         <nova:project uuid="85e2f91a92cf4b5a9d626e8418f17322">tempest-TestExecuteHostMaintenanceStrategy-1767783627</nova:project>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       </nova:owner>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <nova:ports>
Dec 03 00:03:19 compute-1 nova_compute[187157]:         <nova:port uuid="751b874c-d73f-40e4-8f78-0ec9fe6bd11f">
Dec 03 00:03:19 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:         </nova:port>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       </nova:ports>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     </nova:instance>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   </metadata>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <system>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <entry name="serial">219f599c-28c7-4f88-b738-36849b54aeb4</entry>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <entry name="uuid">219f599c-28c7-4f88-b738-36849b54aeb4</entry>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     </system>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   </sysinfo>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   <os>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   </os>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   <features>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <acpi/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <apic/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <vmcoreinfo/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   </features>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   </clock>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact">
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <model>Nehalem</model>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   </cpu>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   <devices>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk.config"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:2d:54:f4"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <target dev="tap751b874c-d7"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     </interface>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/console.log" append="off"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     </serial>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <video>
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     </video>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     </rng>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <controller type="usb" index="0"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:03:19 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 03 00:03:19 compute-1 nova_compute[187157]:     </memballoon>
Dec 03 00:03:19 compute-1 nova_compute[187157]:   </devices>
Dec 03 00:03:19 compute-1 nova_compute[187157]: </domain>
Dec 03 00:03:19 compute-1 nova_compute[187157]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.342 187161 DEBUG nova.compute.manager [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Preparing to wait for external event network-vif-plugged-751b874c-d73f-40e4-8f78-0ec9fe6bd11f prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.343 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.343 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.343 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.344 187161 DEBUG nova.virt.libvirt.vif [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:03:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-739861395',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-739861395',id=13,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-3y1rpwrk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:03:12Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=219f599c-28c7-4f88-b738-36849b54aeb4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "address": "fa:16:3e:2d:54:f4", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap751b874c-d7", "ovs_interfaceid": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.344 187161 DEBUG nova.network.os_vif_util [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converting VIF {"id": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "address": "fa:16:3e:2d:54:f4", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap751b874c-d7", "ovs_interfaceid": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.345 187161 DEBUG nova.network.os_vif_util [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:f4,bridge_name='br-int',has_traffic_filtering=True,id=751b874c-d73f-40e4-8f78-0ec9fe6bd11f,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap751b874c-d7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.345 187161 DEBUG os_vif [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:f4,bridge_name='br-int',has_traffic_filtering=True,id=751b874c-d73f-40e4-8f78-0ec9fe6bd11f,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap751b874c-d7') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.346 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.347 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.347 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.348 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.348 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '6b51b0c6-9009-5026-91a3-fa2353f36b9a', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.349 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.352 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.357 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.357 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap751b874c-d7, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.357 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap751b874c-d7, col_values=(('qos', UUID('dd35b88a-5362-41ad-831f-90edef306941')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.358 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap751b874c-d7, col_values=(('external_ids', {'iface-id': '751b874c-d73f-40e4-8f78-0ec9fe6bd11f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:54:f4', 'vm-uuid': '219f599c-28c7-4f88-b738-36849b54aeb4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.359 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:19 compute-1 NetworkManager[55553]: <info>  [1764720199.3610] manager: (tap751b874c-d7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.361 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.368 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:19 compute-1 nova_compute[187157]: 2025-12-03 00:03:19.368 187161 INFO os_vif [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:f4,bridge_name='br-int',has_traffic_filtering=True,id=751b874c-d73f-40e4-8f78-0ec9fe6bd11f,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap751b874c-d7')
Dec 03 00:03:19 compute-1 openstack_network_exporter[199685]: ERROR   00:03:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:03:19 compute-1 openstack_network_exporter[199685]: ERROR   00:03:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:03:19 compute-1 openstack_network_exporter[199685]: ERROR   00:03:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:03:19 compute-1 openstack_network_exporter[199685]: ERROR   00:03:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:03:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:03:19 compute-1 openstack_network_exporter[199685]: ERROR   00:03:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:03:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:03:20 compute-1 nova_compute[187157]: 2025-12-03 00:03:20.938 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:03:20 compute-1 nova_compute[187157]: 2025-12-03 00:03:20.939 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:03:20 compute-1 nova_compute[187157]: 2025-12-03 00:03:20.939 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] No VIF found with MAC fa:16:3e:2d:54:f4, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:03:20 compute-1 nova_compute[187157]: 2025-12-03 00:03:20.940 187161 INFO nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Using config drive
Dec 03 00:03:21 compute-1 nova_compute[187157]: 2025-12-03 00:03:21.451 187161 WARNING neutronclient.v2_0.client [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:21 compute-1 nova_compute[187157]: 2025-12-03 00:03:21.667 187161 INFO nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Creating config drive at /var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk.config
Dec 03 00:03:21 compute-1 nova_compute[187157]: 2025-12-03 00:03:21.672 187161 DEBUG oslo_concurrency.processutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp6wtg3zs4 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:21 compute-1 nova_compute[187157]: 2025-12-03 00:03:21.798 187161 DEBUG oslo_concurrency.processutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp6wtg3zs4" returned: 0 in 0.126s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:21 compute-1 kernel: tap751b874c-d7: entered promiscuous mode
Dec 03 00:03:21 compute-1 NetworkManager[55553]: <info>  [1764720201.8772] manager: (tap751b874c-d7): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Dec 03 00:03:21 compute-1 systemd-udevd[213128]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:03:21 compute-1 ovn_controller[95464]: 2025-12-03T00:03:21Z|00115|binding|INFO|Claiming lport 751b874c-d73f-40e4-8f78-0ec9fe6bd11f for this chassis.
Dec 03 00:03:21 compute-1 ovn_controller[95464]: 2025-12-03T00:03:21Z|00116|binding|INFO|751b874c-d73f-40e4-8f78-0ec9fe6bd11f: Claiming fa:16:3e:2d:54:f4 10.100.0.13
Dec 03 00:03:21 compute-1 nova_compute[187157]: 2025-12-03 00:03:21.909 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:21 compute-1 nova_compute[187157]: 2025-12-03 00:03:21.918 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:21 compute-1 NetworkManager[55553]: <info>  [1764720201.9200] device (tap751b874c-d7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:03:21 compute-1 NetworkManager[55553]: <info>  [1764720201.9213] device (tap751b874c-d7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:03:21 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:21.934 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:54:f4 10.100.0.13'], port_security=['fa:16:3e:2d:54:f4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '219f599c-28c7-4f88-b738-36849b54aeb4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85e2f91a92cf4b5a9d626e8418f17322', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2256d612-5a1d-4528-93f3-139a5d1ff76a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e46e490-abb3-4025-b870-a46519cde774, chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=751b874c-d73f-40e4-8f78-0ec9fe6bd11f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:03:21 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:21.935 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 751b874c-d73f-40e4-8f78-0ec9fe6bd11f in datapath ed11b71b-745b-4f0c-9f09-37d53d166bcb bound to our chassis
Dec 03 00:03:21 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:21.936 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed11b71b-745b-4f0c-9f09-37d53d166bcb
Dec 03 00:03:21 compute-1 systemd-machined[153454]: New machine qemu-9-instance-0000000d.
Dec 03 00:03:21 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:21.955 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[c1b9be04-e456-410d-bb16-401c68036d71]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:21 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:21.955 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH taped11b71b-71 in ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:03:21 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:21.958 207957 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface taped11b71b-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:03:21 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:21.958 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[7a9e76d7-5bfc-4795-a387-2879ff4b8b07]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:21 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:21.959 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[790d2f61-0169-4ad9-981a-d873af62ecbc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:21 compute-1 ovn_controller[95464]: 2025-12-03T00:03:21Z|00117|binding|INFO|Setting lport 751b874c-d73f-40e4-8f78-0ec9fe6bd11f ovn-installed in OVS
Dec 03 00:03:21 compute-1 ovn_controller[95464]: 2025-12-03T00:03:21Z|00118|binding|INFO|Setting lport 751b874c-d73f-40e4-8f78-0ec9fe6bd11f up in Southbound
Dec 03 00:03:21 compute-1 systemd[1]: Started Virtual Machine qemu-9-instance-0000000d.
Dec 03 00:03:21 compute-1 nova_compute[187157]: 2025-12-03 00:03:21.968 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:21 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:21.977 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[244cd7b3-6d0f-46ee-bfe2-86f0ea46b0e5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:21 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:21.997 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[f0dd4089-447e-4d95-a305-ad0ddfbe6c2f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.047 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[a4b4fa3f-2337-4689-8842-4ce6b9bbb435]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.056 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[574b7e3c-7551-4f6d-88b0-c7b499f85672]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:22 compute-1 NetworkManager[55553]: <info>  [1764720202.0577] manager: (taped11b71b-70): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.100 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[aede7768-6b57-43b4-a3a3-1b9a531a57ab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.103 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[40495b3c-a696-474b-a4dd-77a884345210]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:22 compute-1 NetworkManager[55553]: <info>  [1764720202.1325] device (taped11b71b-70): carrier: link connected
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.145 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[57354525-801d-465a-b8de-6fe0d8ade86a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:22 compute-1 nova_compute[187157]: 2025-12-03 00:03:22.161 187161 DEBUG nova.compute.manager [req-d5fec42f-d2d8-4f1b-93a7-ee0f941c3457 req-0b9769db-ad13-48ed-87ff-8c1f72a2c2a1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Received event network-vif-plugged-751b874c-d73f-40e4-8f78-0ec9fe6bd11f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:03:22 compute-1 nova_compute[187157]: 2025-12-03 00:03:22.161 187161 DEBUG oslo_concurrency.lockutils [req-d5fec42f-d2d8-4f1b-93a7-ee0f941c3457 req-0b9769db-ad13-48ed-87ff-8c1f72a2c2a1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:22 compute-1 nova_compute[187157]: 2025-12-03 00:03:22.161 187161 DEBUG oslo_concurrency.lockutils [req-d5fec42f-d2d8-4f1b-93a7-ee0f941c3457 req-0b9769db-ad13-48ed-87ff-8c1f72a2c2a1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:22 compute-1 nova_compute[187157]: 2025-12-03 00:03:22.162 187161 DEBUG oslo_concurrency.lockutils [req-d5fec42f-d2d8-4f1b-93a7-ee0f941c3457 req-0b9769db-ad13-48ed-87ff-8c1f72a2c2a1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:22 compute-1 nova_compute[187157]: 2025-12-03 00:03:22.162 187161 DEBUG nova.compute.manager [req-d5fec42f-d2d8-4f1b-93a7-ee0f941c3457 req-0b9769db-ad13-48ed-87ff-8c1f72a2c2a1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Processing event network-vif-plugged-751b874c-d73f-40e4-8f78-0ec9fe6bd11f _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.170 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[184fcc63-f13b-476e-bc87-e8ffe873b81d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped11b71b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:bb:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422178, 'reachable_time': 17589, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213165, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.196 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[62776ce8-0159-4b8c-a2e1-c7d95a82fd71]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6d:bbc6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 422178, 'tstamp': 422178}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213166, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.223 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[1b07d4f5-e4b9-4e4a-b36c-8d31b19acf0b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped11b71b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:bb:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422178, 'reachable_time': 17589, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213167, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.269 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[80ff2648-590c-4f8a-a036-46384b438f6d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.346 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[44ad5b6e-4993-4ee6-8127-004d6ae62570]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.348 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped11b71b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.349 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.350 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped11b71b-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:22 compute-1 NetworkManager[55553]: <info>  [1764720202.3543] manager: (taped11b71b-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Dec 03 00:03:22 compute-1 kernel: taped11b71b-70: entered promiscuous mode
Dec 03 00:03:22 compute-1 nova_compute[187157]: 2025-12-03 00:03:22.355 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.357 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped11b71b-70, col_values=(('external_ids', {'iface-id': 'add6ea4f-8836-4bed-8f1e-39e943ccf4b5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:22 compute-1 nova_compute[187157]: 2025-12-03 00:03:22.358 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:22 compute-1 ovn_controller[95464]: 2025-12-03T00:03:22Z|00119|binding|INFO|Releasing lport add6ea4f-8836-4bed-8f1e-39e943ccf4b5 from this chassis (sb_readonly=0)
Dec 03 00:03:22 compute-1 nova_compute[187157]: 2025-12-03 00:03:22.384 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.386 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[0cffaae2-1984-41d4-9eff-467acf7df4da]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.388 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.388 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.388 104348 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for ed11b71b-745b-4f0c-9f09-37d53d166bcb disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.388 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.388 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[59fdf82d-3fa2-4295-94aa-c49b555025c0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.389 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.389 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a82e41b2-c83b-4b49-b83d-b41c0595039d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.390 104348 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: global
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     log         /dev/log local0 debug
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     log-tag     haproxy-metadata-proxy-ed11b71b-745b-4f0c-9f09-37d53d166bcb
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     user        root
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     group       root
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     maxconn     1024
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     pidfile     /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     daemon
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: defaults
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     log global
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     mode http
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     option httplog
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     option dontlognull
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     option http-server-close
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     option forwardfor
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     retries                 3
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     timeout http-request    30s
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     timeout connect         30s
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     timeout client          32s
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     timeout server          32s
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     timeout http-keep-alive 30s
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: listen listener
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     bind 169.254.169.254:80
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:     http-request add-header X-OVN-Network-ID ed11b71b-745b-4f0c-9f09-37d53d166bcb
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:03:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:22.390 104348 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'env', 'PROCESS_TAG=haproxy-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ed11b71b-745b-4f0c-9f09-37d53d166bcb.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:03:22 compute-1 nova_compute[187157]: 2025-12-03 00:03:22.430 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:22 compute-1 nova_compute[187157]: 2025-12-03 00:03:22.492 187161 DEBUG nova.compute.manager [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:03:22 compute-1 nova_compute[187157]: 2025-12-03 00:03:22.494 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:03:22 compute-1 nova_compute[187157]: 2025-12-03 00:03:22.497 187161 INFO nova.virt.libvirt.driver [-] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Instance spawned successfully.
Dec 03 00:03:22 compute-1 nova_compute[187157]: 2025-12-03 00:03:22.498 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:03:22 compute-1 podman[213205]: 2025-12-03 00:03:22.800029133 +0000 UTC m=+0.051128478 container create de680ccf77eb801e4b15ff8990f64c520784cad3644eb6d3377d0e9cce775fd2 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Dec 03 00:03:22 compute-1 systemd[1]: Started libpod-conmon-de680ccf77eb801e4b15ff8990f64c520784cad3644eb6d3377d0e9cce775fd2.scope.
Dec 03 00:03:22 compute-1 podman[213205]: 2025-12-03 00:03:22.773832635 +0000 UTC m=+0.024932000 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:03:22 compute-1 systemd[1]: Started libcrun container.
Dec 03 00:03:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5594c4650ba230212cba7a965d6907f11c2c08c93085055ac2c937a30c8b7d57/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:03:22 compute-1 podman[213205]: 2025-12-03 00:03:22.897075583 +0000 UTC m=+0.148174938 container init de680ccf77eb801e4b15ff8990f64c520784cad3644eb6d3377d0e9cce775fd2 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251202)
Dec 03 00:03:22 compute-1 podman[213205]: 2025-12-03 00:03:22.901924079 +0000 UTC m=+0.153023424 container start de680ccf77eb801e4b15ff8990f64c520784cad3644eb6d3377d0e9cce775fd2 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 03 00:03:22 compute-1 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[213217]: [NOTICE]   (213221) : New worker (213223) forked
Dec 03 00:03:22 compute-1 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[213217]: [NOTICE]   (213221) : Loading success.
Dec 03 00:03:23 compute-1 nova_compute[187157]: 2025-12-03 00:03:23.009 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:03:23 compute-1 nova_compute[187157]: 2025-12-03 00:03:23.010 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:03:23 compute-1 nova_compute[187157]: 2025-12-03 00:03:23.010 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:03:23 compute-1 nova_compute[187157]: 2025-12-03 00:03:23.011 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:03:23 compute-1 nova_compute[187157]: 2025-12-03 00:03:23.011 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:03:23 compute-1 nova_compute[187157]: 2025-12-03 00:03:23.011 187161 DEBUG nova.virt.libvirt.driver [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:03:23 compute-1 nova_compute[187157]: 2025-12-03 00:03:23.522 187161 INFO nova.compute.manager [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Took 9.84 seconds to spawn the instance on the hypervisor.
Dec 03 00:03:23 compute-1 nova_compute[187157]: 2025-12-03 00:03:23.523 187161 DEBUG nova.compute.manager [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:03:24 compute-1 nova_compute[187157]: 2025-12-03 00:03:24.062 187161 INFO nova.compute.manager [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Took 15.07 seconds to build instance.
Dec 03 00:03:24 compute-1 nova_compute[187157]: 2025-12-03 00:03:24.218 187161 DEBUG nova.compute.manager [req-a3dc7ef9-d027-4ef2-b161-13d6e9e6cf36 req-970e38d4-b33c-42bb-b3ec-a7695e8e1bfa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Received event network-vif-plugged-751b874c-d73f-40e4-8f78-0ec9fe6bd11f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:03:24 compute-1 nova_compute[187157]: 2025-12-03 00:03:24.218 187161 DEBUG oslo_concurrency.lockutils [req-a3dc7ef9-d027-4ef2-b161-13d6e9e6cf36 req-970e38d4-b33c-42bb-b3ec-a7695e8e1bfa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:24 compute-1 nova_compute[187157]: 2025-12-03 00:03:24.219 187161 DEBUG oslo_concurrency.lockutils [req-a3dc7ef9-d027-4ef2-b161-13d6e9e6cf36 req-970e38d4-b33c-42bb-b3ec-a7695e8e1bfa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:24 compute-1 nova_compute[187157]: 2025-12-03 00:03:24.219 187161 DEBUG oslo_concurrency.lockutils [req-a3dc7ef9-d027-4ef2-b161-13d6e9e6cf36 req-970e38d4-b33c-42bb-b3ec-a7695e8e1bfa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:24 compute-1 nova_compute[187157]: 2025-12-03 00:03:24.219 187161 DEBUG nova.compute.manager [req-a3dc7ef9-d027-4ef2-b161-13d6e9e6cf36 req-970e38d4-b33c-42bb-b3ec-a7695e8e1bfa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] No waiting events found dispatching network-vif-plugged-751b874c-d73f-40e4-8f78-0ec9fe6bd11f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:03:24 compute-1 nova_compute[187157]: 2025-12-03 00:03:24.219 187161 WARNING nova.compute.manager [req-a3dc7ef9-d027-4ef2-b161-13d6e9e6cf36 req-970e38d4-b33c-42bb-b3ec-a7695e8e1bfa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Received unexpected event network-vif-plugged-751b874c-d73f-40e4-8f78-0ec9fe6bd11f for instance with vm_state active and task_state None.
Dec 03 00:03:24 compute-1 nova_compute[187157]: 2025-12-03 00:03:24.361 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:24 compute-1 nova_compute[187157]: 2025-12-03 00:03:24.569 187161 DEBUG oslo_concurrency.lockutils [None req-8e8e9a5c-0cdf-4674-95e5-87dd30d3d3d6 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "219f599c-28c7-4f88-b738-36849b54aeb4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.590s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:25 compute-1 podman[213232]: 2025-12-03 00:03:25.213127472 +0000 UTC m=+0.058590467 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, release=1755695350, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 03 00:03:27 compute-1 podman[213255]: 2025-12-03 00:03:27.21458919 +0000 UTC m=+0.055623555 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Dec 03 00:03:27 compute-1 nova_compute[187157]: 2025-12-03 00:03:27.433 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:29 compute-1 nova_compute[187157]: 2025-12-03 00:03:29.410 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:31 compute-1 nova_compute[187157]: 2025-12-03 00:03:31.206 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:32 compute-1 nova_compute[187157]: 2025-12-03 00:03:32.435 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:34 compute-1 nova_compute[187157]: 2025-12-03 00:03:34.415 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:34 compute-1 ovn_controller[95464]: 2025-12-03T00:03:34Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2d:54:f4 10.100.0.13
Dec 03 00:03:34 compute-1 ovn_controller[95464]: 2025-12-03T00:03:34Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2d:54:f4 10.100.0.13
Dec 03 00:03:35 compute-1 podman[197537]: time="2025-12-03T00:03:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:03:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:03:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:03:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:03:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3073 "" "Go-http-client/1.1"
Dec 03 00:03:35 compute-1 podman[213287]: 2025-12-03 00:03:35.718881817 +0000 UTC m=+0.049184301 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:03:36 compute-1 nova_compute[187157]: 2025-12-03 00:03:36.694 187161 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Creating tmpfile /var/lib/nova/instances/tmpzkdk_x_6 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 03 00:03:36 compute-1 nova_compute[187157]: 2025-12-03 00:03:36.696 187161 WARNING neutronclient.v2_0.client [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:36 compute-1 nova_compute[187157]: 2025-12-03 00:03:36.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:36 compute-1 nova_compute[187157]: 2025-12-03 00:03:36.710 187161 DEBUG nova.compute.manager [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzkdk_x_6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 03 00:03:37 compute-1 nova_compute[187157]: 2025-12-03 00:03:37.437 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:38 compute-1 nova_compute[187157]: 2025-12-03 00:03:38.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:38 compute-1 nova_compute[187157]: 2025-12-03 00:03:38.803 187161 WARNING neutronclient.v2_0.client [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:39 compute-1 nova_compute[187157]: 2025-12-03 00:03:39.212 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:39 compute-1 nova_compute[187157]: 2025-12-03 00:03:39.213 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:39 compute-1 nova_compute[187157]: 2025-12-03 00:03:39.214 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:39 compute-1 nova_compute[187157]: 2025-12-03 00:03:39.214 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:03:39 compute-1 nova_compute[187157]: 2025-12-03 00:03:39.619 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:40 compute-1 nova_compute[187157]: 2025-12-03 00:03:40.252 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:40 compute-1 nova_compute[187157]: 2025-12-03 00:03:40.323 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:40 compute-1 nova_compute[187157]: 2025-12-03 00:03:40.324 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:40 compute-1 nova_compute[187157]: 2025-12-03 00:03:40.376 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:40 compute-1 nova_compute[187157]: 2025-12-03 00:03:40.545 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:03:40 compute-1 nova_compute[187157]: 2025-12-03 00:03:40.546 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:40 compute-1 nova_compute[187157]: 2025-12-03 00:03:40.569 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:40 compute-1 nova_compute[187157]: 2025-12-03 00:03:40.569 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5659MB free_disk=73.13956451416016GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:03:40 compute-1 nova_compute[187157]: 2025-12-03 00:03:40.570 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:40 compute-1 nova_compute[187157]: 2025-12-03 00:03:40.570 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:41 compute-1 podman[213320]: 2025-12-03 00:03:41.240953007 +0000 UTC m=+0.080926731 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Dec 03 00:03:41 compute-1 nova_compute[187157]: 2025-12-03 00:03:41.596 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Migration for instance 1c6c7975-72fd-442a-b75f-0baede84a60b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:03:42 compute-1 nova_compute[187157]: 2025-12-03 00:03:42.106 187161 INFO nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Updating resource usage from migration dcc97c02-db8d-4b14-b48e-41025617bbf0
Dec 03 00:03:42 compute-1 nova_compute[187157]: 2025-12-03 00:03:42.106 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Starting to track incoming migration dcc97c02-db8d-4b14-b48e-41025617bbf0 with flavor b2669e62-ef04-4b34-b3d6-69efcfbafbdc _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 03 00:03:42 compute-1 nova_compute[187157]: 2025-12-03 00:03:42.440 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:42 compute-1 nova_compute[187157]: 2025-12-03 00:03:42.673 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 219f599c-28c7-4f88-b738-36849b54aeb4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:03:43 compute-1 nova_compute[187157]: 2025-12-03 00:03:43.180 187161 WARNING nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 1c6c7975-72fd-442a-b75f-0baede84a60b has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Dec 03 00:03:43 compute-1 nova_compute[187157]: 2025-12-03 00:03:43.180 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:03:43 compute-1 nova_compute[187157]: 2025-12-03 00:03:43.181 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:03:40 up  1:10,  0 user,  load average: 0.43, 0.33, 0.36\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_85e2f91a92cf4b5a9d626e8418f17322': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:03:43 compute-1 nova_compute[187157]: 2025-12-03 00:03:43.209 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing inventories for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 03 00:03:43 compute-1 podman[213347]: 2025-12-03 00:03:43.231335058 +0000 UTC m=+0.056509930 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 03 00:03:43 compute-1 nova_compute[187157]: 2025-12-03 00:03:43.239 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Updating ProviderTree inventory for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 03 00:03:43 compute-1 nova_compute[187157]: 2025-12-03 00:03:43.239 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Updating inventory in ProviderTree for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 03 00:03:43 compute-1 nova_compute[187157]: 2025-12-03 00:03:43.256 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing aggregate associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 03 00:03:43 compute-1 nova_compute[187157]: 2025-12-03 00:03:43.285 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing trait associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ARCH_X86_64,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 03 00:03:43 compute-1 nova_compute[187157]: 2025-12-03 00:03:43.368 187161 DEBUG nova.compute.manager [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzkdk_x_6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c6c7975-72fd-442a-b75f-0baede84a60b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 03 00:03:43 compute-1 nova_compute[187157]: 2025-12-03 00:03:43.374 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:03:43 compute-1 nova_compute[187157]: 2025-12-03 00:03:43.881 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:03:44 compute-1 nova_compute[187157]: 2025-12-03 00:03:44.386 187161 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-1c6c7975-72fd-442a-b75f-0baede84a60b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:03:44 compute-1 nova_compute[187157]: 2025-12-03 00:03:44.386 187161 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-1c6c7975-72fd-442a-b75f-0baede84a60b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:03:44 compute-1 nova_compute[187157]: 2025-12-03 00:03:44.387 187161 DEBUG nova.network.neutron [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:03:44 compute-1 nova_compute[187157]: 2025-12-03 00:03:44.390 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:03:44 compute-1 nova_compute[187157]: 2025-12-03 00:03:44.390 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.820s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:44 compute-1 nova_compute[187157]: 2025-12-03 00:03:44.622 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:44 compute-1 nova_compute[187157]: 2025-12-03 00:03:44.893 187161 WARNING neutronclient.v2_0.client [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:45 compute-1 nova_compute[187157]: 2025-12-03 00:03:45.385 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:45 compute-1 nova_compute[187157]: 2025-12-03 00:03:45.386 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:45 compute-1 nova_compute[187157]: 2025-12-03 00:03:45.386 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:45 compute-1 nova_compute[187157]: 2025-12-03 00:03:45.386 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:03:45 compute-1 nova_compute[187157]: 2025-12-03 00:03:45.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:45 compute-1 nova_compute[187157]: 2025-12-03 00:03:45.751 187161 WARNING neutronclient.v2_0.client [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:45 compute-1 nova_compute[187157]: 2025-12-03 00:03:45.933 187161 DEBUG nova.network.neutron [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Updating instance_info_cache with network_info: [{"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:03:46 compute-1 nova_compute[187157]: 2025-12-03 00:03:46.441 187161 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-1c6c7975-72fd-442a-b75f-0baede84a60b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:03:46 compute-1 nova_compute[187157]: 2025-12-03 00:03:46.461 187161 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzkdk_x_6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c6c7975-72fd-442a-b75f-0baede84a60b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 03 00:03:46 compute-1 nova_compute[187157]: 2025-12-03 00:03:46.463 187161 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Creating instance directory: /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 03 00:03:46 compute-1 nova_compute[187157]: 2025-12-03 00:03:46.464 187161 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Creating disk.info with the contents: {'/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk': 'qcow2', '/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 03 00:03:46 compute-1 nova_compute[187157]: 2025-12-03 00:03:46.465 187161 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 03 00:03:46 compute-1 nova_compute[187157]: 2025-12-03 00:03:46.466 187161 DEBUG nova.objects.instance [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1c6c7975-72fd-442a-b75f-0baede84a60b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:03:46 compute-1 nova_compute[187157]: 2025-12-03 00:03:46.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:46 compute-1 nova_compute[187157]: 2025-12-03 00:03:46.972 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:03:46 compute-1 nova_compute[187157]: 2025-12-03 00:03:46.978 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:03:46 compute-1 nova_compute[187157]: 2025-12-03 00:03:46.981 187161 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.060 187161 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.061 187161 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.061 187161 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.062 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.064 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.065 187161 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.145 187161 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.146 187161 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.180 187161 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.181 187161 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.181 187161 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.244 187161 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.245 187161 DEBUG nova.virt.disk.api [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.245 187161 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.315 187161 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.316 187161 DEBUG nova.virt.disk.api [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.316 187161 DEBUG nova.objects.instance [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 1c6c7975-72fd-442a-b75f-0baede84a60b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.442 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.823 187161 DEBUG nova.objects.base [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<1c6c7975-72fd-442a-b75f-0baede84a60b> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.823 187161 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.870 187161 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk.config 497664" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.871 187161 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.872 187161 DEBUG nova.virt.libvirt.vif [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:02:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-233597543',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-233597543',id=12,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:03:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-3he48fro',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:03:03Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=1c6c7975-72fd-442a-b75f-0baede84a60b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.873 187161 DEBUG nova.network.os_vif_util [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.873 187161 DEBUG nova.network.os_vif_util [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:79:f2,bridge_name='br-int',has_traffic_filtering=True,id=7b8033d7-6209-4ba1-8605-72623902a9a9,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b8033d7-62') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.874 187161 DEBUG os_vif [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:79:f2,bridge_name='br-int',has_traffic_filtering=True,id=7b8033d7-6209-4ba1-8605-72623902a9a9,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b8033d7-62') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.874 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.875 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.875 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.876 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.876 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '100969f5-aa2e-5b48-a96a-4a62dae66314', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.877 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.878 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.881 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.881 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b8033d7-62, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.881 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap7b8033d7-62, col_values=(('qos', UUID('aa0e229a-2072-4307-80f3-cc04d54c8f36')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.882 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap7b8033d7-62, col_values=(('external_ids', {'iface-id': '7b8033d7-6209-4ba1-8605-72623902a9a9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:68:79:f2', 'vm-uuid': '1c6c7975-72fd-442a-b75f-0baede84a60b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.882 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.885 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:03:47 compute-1 NetworkManager[55553]: <info>  [1764720227.8854] manager: (tap7b8033d7-62): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.889 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.890 187161 INFO os_vif [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:79:f2,bridge_name='br-int',has_traffic_filtering=True,id=7b8033d7-6209-4ba1-8605-72623902a9a9,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b8033d7-62')
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.890 187161 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.890 187161 DEBUG nova.compute.manager [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzkdk_x_6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c6c7975-72fd-442a-b75f-0baede84a60b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 03 00:03:47 compute-1 nova_compute[187157]: 2025-12-03 00:03:47.891 187161 WARNING neutronclient.v2_0.client [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:48 compute-1 nova_compute[187157]: 2025-12-03 00:03:48.453 187161 WARNING neutronclient.v2_0.client [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:49 compute-1 nova_compute[187157]: 2025-12-03 00:03:49.268 187161 DEBUG nova.network.neutron [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Port 7b8033d7-6209-4ba1-8605-72623902a9a9 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 03 00:03:49 compute-1 nova_compute[187157]: 2025-12-03 00:03:49.288 187161 DEBUG nova.compute.manager [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzkdk_x_6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c6c7975-72fd-442a-b75f-0baede84a60b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 03 00:03:49 compute-1 openstack_network_exporter[199685]: ERROR   00:03:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:03:49 compute-1 openstack_network_exporter[199685]: ERROR   00:03:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:03:49 compute-1 openstack_network_exporter[199685]: ERROR   00:03:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:03:49 compute-1 openstack_network_exporter[199685]: ERROR   00:03:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:03:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:03:49 compute-1 openstack_network_exporter[199685]: ERROR   00:03:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:03:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:03:52 compute-1 ovn_controller[95464]: 2025-12-03T00:03:52Z|00120|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Dec 03 00:03:52 compute-1 nova_compute[187157]: 2025-12-03 00:03:52.444 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:52 compute-1 systemd[1]: Starting libvirt proxy daemon...
Dec 03 00:03:52 compute-1 systemd[1]: Started libvirt proxy daemon.
Dec 03 00:03:52 compute-1 kernel: tap7b8033d7-62: entered promiscuous mode
Dec 03 00:03:52 compute-1 NetworkManager[55553]: <info>  [1764720232.8333] manager: (tap7b8033d7-62): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Dec 03 00:03:52 compute-1 nova_compute[187157]: 2025-12-03 00:03:52.835 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:52 compute-1 ovn_controller[95464]: 2025-12-03T00:03:52Z|00121|binding|INFO|Claiming lport 7b8033d7-6209-4ba1-8605-72623902a9a9 for this additional chassis.
Dec 03 00:03:52 compute-1 ovn_controller[95464]: 2025-12-03T00:03:52Z|00122|binding|INFO|7b8033d7-6209-4ba1-8605-72623902a9a9: Claiming fa:16:3e:68:79:f2 10.100.0.14
Dec 03 00:03:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:52.850 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:79:f2 10.100.0.14'], port_security=['fa:16:3e:68:79:f2 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1c6c7975-72fd-442a-b75f-0baede84a60b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85e2f91a92cf4b5a9d626e8418f17322', 'neutron:revision_number': '10', 'neutron:security_group_ids': '2256d612-5a1d-4528-93f3-139a5d1ff76a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e46e490-abb3-4025-b870-a46519cde774, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=7b8033d7-6209-4ba1-8605-72623902a9a9) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:03:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:52.851 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 7b8033d7-6209-4ba1-8605-72623902a9a9 in datapath ed11b71b-745b-4f0c-9f09-37d53d166bcb unbound from our chassis
Dec 03 00:03:52 compute-1 ovn_controller[95464]: 2025-12-03T00:03:52Z|00123|binding|INFO|Setting lport 7b8033d7-6209-4ba1-8605-72623902a9a9 ovn-installed in OVS
Dec 03 00:03:52 compute-1 nova_compute[187157]: 2025-12-03 00:03:52.851 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:52.853 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed11b71b-745b-4f0c-9f09-37d53d166bcb
Dec 03 00:03:52 compute-1 nova_compute[187157]: 2025-12-03 00:03:52.860 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:52.866 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a82927df-83ab-47e5-9172-ff839f0f929d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:52 compute-1 systemd-machined[153454]: New machine qemu-10-instance-0000000c.
Dec 03 00:03:52 compute-1 nova_compute[187157]: 2025-12-03 00:03:52.883 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:52 compute-1 systemd[1]: Started Virtual Machine qemu-10-instance-0000000c.
Dec 03 00:03:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:52.899 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[dbb6fc67-779f-4b0b-b9d8-04f97b558915]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:52 compute-1 systemd-udevd[213423]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:03:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:52.906 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[1a59939b-194b-4851-b346-bda090566761]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:52 compute-1 NetworkManager[55553]: <info>  [1764720232.9179] device (tap7b8033d7-62): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:03:52 compute-1 NetworkManager[55553]: <info>  [1764720232.9192] device (tap7b8033d7-62): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:03:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:52.932 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[937a397e-dc17-4d18-8efc-7ad2a263d059]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:52.949 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[4a87648c-5bc6-4d15-b25c-ab2b94031f4e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped11b71b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:bb:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422178, 'reachable_time': 23706, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213431, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:52.963 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[0e29d56e-7e45-4fbe-b851-31195ad734f5]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'taped11b71b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 422195, 'tstamp': 422195}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213434, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'taped11b71b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 422199, 'tstamp': 422199}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213434, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:52.964 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped11b71b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:52 compute-1 nova_compute[187157]: 2025-12-03 00:03:52.965 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:52 compute-1 nova_compute[187157]: 2025-12-03 00:03:52.966 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:52.967 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped11b71b-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:52.967 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:03:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:52.967 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped11b71b-70, col_values=(('external_ids', {'iface-id': 'add6ea4f-8836-4bed-8f1e-39e943ccf4b5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:52.967 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:03:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:03:52.968 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[9449ff7a-d468-4dc1-bac8-b0675b1b7679]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ed11b71b-745b-4f0c-9f09-37d53d166bcb\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ed11b71b-745b-4f0c-9f09-37d53d166bcb\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:55 compute-1 ovn_controller[95464]: 2025-12-03T00:03:55Z|00124|binding|INFO|Claiming lport 7b8033d7-6209-4ba1-8605-72623902a9a9 for this chassis.
Dec 03 00:03:55 compute-1 ovn_controller[95464]: 2025-12-03T00:03:55Z|00125|binding|INFO|7b8033d7-6209-4ba1-8605-72623902a9a9: Claiming fa:16:3e:68:79:f2 10.100.0.14
Dec 03 00:03:55 compute-1 ovn_controller[95464]: 2025-12-03T00:03:55Z|00126|binding|INFO|Setting lport 7b8033d7-6209-4ba1-8605-72623902a9a9 up in Southbound
Dec 03 00:03:56 compute-1 podman[213455]: 2025-12-03 00:03:56.222864495 +0000 UTC m=+0.064940530 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9)
Dec 03 00:03:56 compute-1 nova_compute[187157]: 2025-12-03 00:03:56.857 187161 INFO nova.compute.manager [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Post operation of migration started
Dec 03 00:03:56 compute-1 nova_compute[187157]: 2025-12-03 00:03:56.858 187161 WARNING neutronclient.v2_0.client [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:57 compute-1 nova_compute[187157]: 2025-12-03 00:03:57.446 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:57 compute-1 nova_compute[187157]: 2025-12-03 00:03:57.473 187161 WARNING neutronclient.v2_0.client [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:57 compute-1 nova_compute[187157]: 2025-12-03 00:03:57.474 187161 WARNING neutronclient.v2_0.client [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:57 compute-1 nova_compute[187157]: 2025-12-03 00:03:57.560 187161 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-1c6c7975-72fd-442a-b75f-0baede84a60b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:03:57 compute-1 nova_compute[187157]: 2025-12-03 00:03:57.560 187161 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-1c6c7975-72fd-442a-b75f-0baede84a60b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:03:57 compute-1 nova_compute[187157]: 2025-12-03 00:03:57.560 187161 DEBUG nova.network.neutron [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:03:57 compute-1 nova_compute[187157]: 2025-12-03 00:03:57.885 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:58 compute-1 nova_compute[187157]: 2025-12-03 00:03:58.068 187161 WARNING neutronclient.v2_0.client [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:58 compute-1 podman[213476]: 2025-12-03 00:03:58.246416207 +0000 UTC m=+0.082653943 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 03 00:03:58 compute-1 nova_compute[187157]: 2025-12-03 00:03:58.832 187161 WARNING neutronclient.v2_0.client [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:58 compute-1 nova_compute[187157]: 2025-12-03 00:03:58.993 187161 DEBUG nova.network.neutron [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Updating instance_info_cache with network_info: [{"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:03:59 compute-1 nova_compute[187157]: 2025-12-03 00:03:59.500 187161 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-1c6c7975-72fd-442a-b75f-0baede84a60b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:04:00 compute-1 nova_compute[187157]: 2025-12-03 00:04:00.017 187161 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:00 compute-1 nova_compute[187157]: 2025-12-03 00:04:00.018 187161 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:00 compute-1 nova_compute[187157]: 2025-12-03 00:04:00.018 187161 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:00 compute-1 nova_compute[187157]: 2025-12-03 00:04:00.022 187161 INFO nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 03 00:04:00 compute-1 virtqemud[186882]: Domain id=10 name='instance-0000000c' uuid=1c6c7975-72fd-442a-b75f-0baede84a60b is tainted: custom-monitor
Dec 03 00:04:01 compute-1 nova_compute[187157]: 2025-12-03 00:04:01.030 187161 INFO nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 03 00:04:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:01.718 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:01.719 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:01.720 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:02 compute-1 nova_compute[187157]: 2025-12-03 00:04:02.035 187161 INFO nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 03 00:04:02 compute-1 nova_compute[187157]: 2025-12-03 00:04:02.039 187161 DEBUG nova.compute.manager [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:04:02 compute-1 nova_compute[187157]: 2025-12-03 00:04:02.448 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:02 compute-1 nova_compute[187157]: 2025-12-03 00:04:02.551 187161 DEBUG nova.objects.instance [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 03 00:04:02 compute-1 nova_compute[187157]: 2025-12-03 00:04:02.887 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:03 compute-1 nova_compute[187157]: 2025-12-03 00:04:03.579 187161 WARNING neutronclient.v2_0.client [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:03 compute-1 nova_compute[187157]: 2025-12-03 00:04:03.676 187161 WARNING neutronclient.v2_0.client [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:03 compute-1 nova_compute[187157]: 2025-12-03 00:04:03.676 187161 WARNING neutronclient.v2_0.client [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:05 compute-1 podman[197537]: time="2025-12-03T00:04:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:04:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:04:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:04:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:04:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3077 "" "Go-http-client/1.1"
Dec 03 00:04:06 compute-1 podman[213498]: 2025-12-03 00:04:06.204828852 +0000 UTC m=+0.048708923 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.109 187161 DEBUG oslo_concurrency.lockutils [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "219f599c-28c7-4f88-b738-36849b54aeb4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.110 187161 DEBUG oslo_concurrency.lockutils [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "219f599c-28c7-4f88-b738-36849b54aeb4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.110 187161 DEBUG oslo_concurrency.lockutils [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.110 187161 DEBUG oslo_concurrency.lockutils [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.110 187161 DEBUG oslo_concurrency.lockutils [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.121 187161 INFO nova.compute.manager [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Terminating instance
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.450 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.634 187161 DEBUG nova.compute.manager [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:04:07 compute-1 kernel: tap751b874c-d7 (unregistering): left promiscuous mode
Dec 03 00:04:07 compute-1 NetworkManager[55553]: <info>  [1764720247.6586] device (tap751b874c-d7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:04:07 compute-1 ovn_controller[95464]: 2025-12-03T00:04:07Z|00127|binding|INFO|Releasing lport 751b874c-d73f-40e4-8f78-0ec9fe6bd11f from this chassis (sb_readonly=0)
Dec 03 00:04:07 compute-1 ovn_controller[95464]: 2025-12-03T00:04:07Z|00128|binding|INFO|Setting lport 751b874c-d73f-40e4-8f78-0ec9fe6bd11f down in Southbound
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.662 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:07 compute-1 ovn_controller[95464]: 2025-12-03T00:04:07Z|00129|binding|INFO|Removing iface tap751b874c-d7 ovn-installed in OVS
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.664 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:07.671 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:54:f4 10.100.0.13'], port_security=['fa:16:3e:2d:54:f4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '219f599c-28c7-4f88-b738-36849b54aeb4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85e2f91a92cf4b5a9d626e8418f17322', 'neutron:revision_number': '5', 'neutron:security_group_ids': '2256d612-5a1d-4528-93f3-139a5d1ff76a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e46e490-abb3-4025-b870-a46519cde774, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=751b874c-d73f-40e4-8f78-0ec9fe6bd11f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:04:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:07.672 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 751b874c-d73f-40e4-8f78-0ec9fe6bd11f in datapath ed11b71b-745b-4f0c-9f09-37d53d166bcb unbound from our chassis
Dec 03 00:04:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:07.673 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed11b71b-745b-4f0c-9f09-37d53d166bcb
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.683 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:07.687 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e58440f1-ca25-49fb-b8db-738a19b4ef63]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:07.715 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[d3f71d2e-94a9-467b-8be9-955edac02ae5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:07 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Dec 03 00:04:07 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Consumed 14.308s CPU time.
Dec 03 00:04:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:07.718 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[e4f606d9-b45b-4608-974c-96ce9e4d6b58]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:07 compute-1 systemd-machined[153454]: Machine qemu-9-instance-0000000d terminated.
Dec 03 00:04:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:07.743 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[fe1b11ad-b482-4e08-8176-551afc643f3b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:07.759 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[6d145968-ee6c-4bbe-96c8-9b1d2ac1173f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped11b71b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:bb:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422178, 'reachable_time': 23706, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213534, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:07.774 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[05931d6e-d49f-4844-86c7-0bde955f906e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'taped11b71b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 422195, 'tstamp': 422195}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213535, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'taped11b71b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 422199, 'tstamp': 422199}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213535, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:07.775 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped11b71b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.776 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.780 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:07.780 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped11b71b-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:07.780 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:04:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:07.781 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped11b71b-70, col_values=(('external_ids', {'iface-id': 'add6ea4f-8836-4bed-8f1e-39e943ccf4b5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:07.781 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:04:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:07.782 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ebf22117-b1c2-4cbc-bc42-2d9c2f285319]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ed11b71b-745b-4f0c-9f09-37d53d166bcb\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ed11b71b-745b-4f0c-9f09-37d53d166bcb\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.847 187161 DEBUG nova.compute.manager [req-f3a913e8-d26f-49f0-bc38-c3566248f727 req-d767a450-4a22-4658-9c6d-f9ea66356294 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Received event network-vif-unplugged-751b874c-d73f-40e4-8f78-0ec9fe6bd11f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.848 187161 DEBUG oslo_concurrency.lockutils [req-f3a913e8-d26f-49f0-bc38-c3566248f727 req-d767a450-4a22-4658-9c6d-f9ea66356294 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.848 187161 DEBUG oslo_concurrency.lockutils [req-f3a913e8-d26f-49f0-bc38-c3566248f727 req-d767a450-4a22-4658-9c6d-f9ea66356294 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.848 187161 DEBUG oslo_concurrency.lockutils [req-f3a913e8-d26f-49f0-bc38-c3566248f727 req-d767a450-4a22-4658-9c6d-f9ea66356294 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.848 187161 DEBUG nova.compute.manager [req-f3a913e8-d26f-49f0-bc38-c3566248f727 req-d767a450-4a22-4658-9c6d-f9ea66356294 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] No waiting events found dispatching network-vif-unplugged-751b874c-d73f-40e4-8f78-0ec9fe6bd11f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.848 187161 DEBUG nova.compute.manager [req-f3a913e8-d26f-49f0-bc38-c3566248f727 req-d767a450-4a22-4658-9c6d-f9ea66356294 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Received event network-vif-unplugged-751b874c-d73f-40e4-8f78-0ec9fe6bd11f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.855 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.860 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.888 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.899 187161 INFO nova.virt.libvirt.driver [-] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Instance destroyed successfully.
Dec 03 00:04:07 compute-1 nova_compute[187157]: 2025-12-03 00:04:07.899 187161 DEBUG nova.objects.instance [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lazy-loading 'resources' on Instance uuid 219f599c-28c7-4f88-b738-36849b54aeb4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.410 187161 DEBUG nova.virt.libvirt.vif [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-03T00:03:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-739861395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-739861395',id=13,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:03:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-3y1rpwrk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:03:23Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=219f599c-28c7-4f88-b738-36849b54aeb4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "address": "fa:16:3e:2d:54:f4", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap751b874c-d7", "ovs_interfaceid": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.411 187161 DEBUG nova.network.os_vif_util [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converting VIF {"id": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "address": "fa:16:3e:2d:54:f4", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap751b874c-d7", "ovs_interfaceid": "751b874c-d73f-40e4-8f78-0ec9fe6bd11f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.411 187161 DEBUG nova.network.os_vif_util [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:f4,bridge_name='br-int',has_traffic_filtering=True,id=751b874c-d73f-40e4-8f78-0ec9fe6bd11f,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap751b874c-d7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.412 187161 DEBUG os_vif [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:f4,bridge_name='br-int',has_traffic_filtering=True,id=751b874c-d73f-40e4-8f78-0ec9fe6bd11f,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap751b874c-d7') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.413 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.413 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap751b874c-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.415 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.417 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.417 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.418 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.418 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=dd35b88a-5362-41ad-831f-90edef306941) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.419 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.420 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.421 187161 INFO os_vif [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:f4,bridge_name='br-int',has_traffic_filtering=True,id=751b874c-d73f-40e4-8f78-0ec9fe6bd11f,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap751b874c-d7')
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.421 187161 INFO nova.virt.libvirt.driver [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Deleting instance files /var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4_del
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.422 187161 INFO nova.virt.libvirt.driver [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Deletion of /var/lib/nova/instances/219f599c-28c7-4f88-b738-36849b54aeb4_del complete
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.936 187161 INFO nova.compute.manager [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Took 1.30 seconds to destroy the instance on the hypervisor.
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.936 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.937 187161 DEBUG nova.compute.manager [-] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.937 187161 DEBUG nova.network.neutron [-] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:04:08 compute-1 nova_compute[187157]: 2025-12-03 00:04:08.937 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:09 compute-1 nova_compute[187157]: 2025-12-03 00:04:09.048 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:09 compute-1 nova_compute[187157]: 2025-12-03 00:04:09.487 187161 DEBUG nova.compute.manager [req-a9b6b295-b051-4fa3-a504-4953c5170778 req-669a15b6-6ed0-4675-82f7-6d450be9657a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Received event network-vif-deleted-751b874c-d73f-40e4-8f78-0ec9fe6bd11f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:04:09 compute-1 nova_compute[187157]: 2025-12-03 00:04:09.488 187161 INFO nova.compute.manager [req-a9b6b295-b051-4fa3-a504-4953c5170778 req-669a15b6-6ed0-4675-82f7-6d450be9657a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Neutron deleted interface 751b874c-d73f-40e4-8f78-0ec9fe6bd11f; detaching it from the instance and deleting it from the info cache
Dec 03 00:04:09 compute-1 nova_compute[187157]: 2025-12-03 00:04:09.488 187161 DEBUG nova.network.neutron [req-a9b6b295-b051-4fa3-a504-4953c5170778 req-669a15b6-6ed0-4675-82f7-6d450be9657a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:04:09 compute-1 nova_compute[187157]: 2025-12-03 00:04:09.912 187161 DEBUG nova.compute.manager [req-effc4a4a-73c6-42c5-b0cb-d8e28f1a3c7a req-ac1ca34f-07d0-482c-b84d-1ecfee41c801 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Received event network-vif-unplugged-751b874c-d73f-40e4-8f78-0ec9fe6bd11f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:04:09 compute-1 nova_compute[187157]: 2025-12-03 00:04:09.912 187161 DEBUG oslo_concurrency.lockutils [req-effc4a4a-73c6-42c5-b0cb-d8e28f1a3c7a req-ac1ca34f-07d0-482c-b84d-1ecfee41c801 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:09 compute-1 nova_compute[187157]: 2025-12-03 00:04:09.913 187161 DEBUG oslo_concurrency.lockutils [req-effc4a4a-73c6-42c5-b0cb-d8e28f1a3c7a req-ac1ca34f-07d0-482c-b84d-1ecfee41c801 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:09 compute-1 nova_compute[187157]: 2025-12-03 00:04:09.913 187161 DEBUG oslo_concurrency.lockutils [req-effc4a4a-73c6-42c5-b0cb-d8e28f1a3c7a req-ac1ca34f-07d0-482c-b84d-1ecfee41c801 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "219f599c-28c7-4f88-b738-36849b54aeb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:09 compute-1 nova_compute[187157]: 2025-12-03 00:04:09.913 187161 DEBUG nova.compute.manager [req-effc4a4a-73c6-42c5-b0cb-d8e28f1a3c7a req-ac1ca34f-07d0-482c-b84d-1ecfee41c801 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] No waiting events found dispatching network-vif-unplugged-751b874c-d73f-40e4-8f78-0ec9fe6bd11f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:04:09 compute-1 nova_compute[187157]: 2025-12-03 00:04:09.913 187161 DEBUG nova.compute.manager [req-effc4a4a-73c6-42c5-b0cb-d8e28f1a3c7a req-ac1ca34f-07d0-482c-b84d-1ecfee41c801 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Received event network-vif-unplugged-751b874c-d73f-40e4-8f78-0ec9fe6bd11f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:04:09 compute-1 nova_compute[187157]: 2025-12-03 00:04:09.926 187161 DEBUG nova.network.neutron [-] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:04:09 compute-1 nova_compute[187157]: 2025-12-03 00:04:09.995 187161 DEBUG nova.compute.manager [req-a9b6b295-b051-4fa3-a504-4953c5170778 req-669a15b6-6ed0-4675-82f7-6d450be9657a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Detach interface failed, port_id=751b874c-d73f-40e4-8f78-0ec9fe6bd11f, reason: Instance 219f599c-28c7-4f88-b738-36849b54aeb4 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:04:10 compute-1 nova_compute[187157]: 2025-12-03 00:04:10.431 187161 INFO nova.compute.manager [-] [instance: 219f599c-28c7-4f88-b738-36849b54aeb4] Took 1.49 seconds to deallocate network for instance.
Dec 03 00:04:10 compute-1 nova_compute[187157]: 2025-12-03 00:04:10.968 187161 DEBUG oslo_concurrency.lockutils [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:10 compute-1 nova_compute[187157]: 2025-12-03 00:04:10.968 187161 DEBUG oslo_concurrency.lockutils [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:11 compute-1 nova_compute[187157]: 2025-12-03 00:04:11.037 187161 DEBUG nova.compute.provider_tree [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:04:11 compute-1 nova_compute[187157]: 2025-12-03 00:04:11.545 187161 DEBUG nova.scheduler.client.report [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:04:12 compute-1 nova_compute[187157]: 2025-12-03 00:04:12.060 187161 DEBUG oslo_concurrency.lockutils [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.091s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:12 compute-1 nova_compute[187157]: 2025-12-03 00:04:12.082 187161 INFO nova.scheduler.client.report [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Deleted allocations for instance 219f599c-28c7-4f88-b738-36849b54aeb4
Dec 03 00:04:12 compute-1 podman[213551]: 2025-12-03 00:04:12.263235582 +0000 UTC m=+0.097536218 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 03 00:04:12 compute-1 nova_compute[187157]: 2025-12-03 00:04:12.451 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:13 compute-1 nova_compute[187157]: 2025-12-03 00:04:13.107 187161 DEBUG oslo_concurrency.lockutils [None req-f8eacf05-6de3-4bf5-9fef-7aae099f6bc0 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "219f599c-28c7-4f88-b738-36849b54aeb4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.998s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:13 compute-1 nova_compute[187157]: 2025-12-03 00:04:13.420 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:13 compute-1 nova_compute[187157]: 2025-12-03 00:04:13.773 187161 DEBUG oslo_concurrency.lockutils [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:13 compute-1 nova_compute[187157]: 2025-12-03 00:04:13.774 187161 DEBUG oslo_concurrency.lockutils [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:13 compute-1 nova_compute[187157]: 2025-12-03 00:04:13.774 187161 DEBUG oslo_concurrency.lockutils [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:13 compute-1 nova_compute[187157]: 2025-12-03 00:04:13.775 187161 DEBUG oslo_concurrency.lockutils [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:13 compute-1 nova_compute[187157]: 2025-12-03 00:04:13.775 187161 DEBUG oslo_concurrency.lockutils [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:13 compute-1 nova_compute[187157]: 2025-12-03 00:04:13.792 187161 INFO nova.compute.manager [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Terminating instance
Dec 03 00:04:14 compute-1 podman[213578]: 2025-12-03 00:04:14.211513047 +0000 UTC m=+0.053571728 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.312 187161 DEBUG nova.compute.manager [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:04:14 compute-1 kernel: tap7b8033d7-62 (unregistering): left promiscuous mode
Dec 03 00:04:14 compute-1 ovn_controller[95464]: 2025-12-03T00:04:14Z|00130|binding|INFO|Releasing lport 7b8033d7-6209-4ba1-8605-72623902a9a9 from this chassis (sb_readonly=0)
Dec 03 00:04:14 compute-1 NetworkManager[55553]: <info>  [1764720254.3388] device (tap7b8033d7-62): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.339 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:14 compute-1 ovn_controller[95464]: 2025-12-03T00:04:14Z|00131|binding|INFO|Setting lport 7b8033d7-6209-4ba1-8605-72623902a9a9 down in Southbound
Dec 03 00:04:14 compute-1 ovn_controller[95464]: 2025-12-03T00:04:14Z|00132|binding|INFO|Removing iface tap7b8033d7-62 ovn-installed in OVS
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.342 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.351 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:79:f2 10.100.0.14'], port_security=['fa:16:3e:68:79:f2 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1c6c7975-72fd-442a-b75f-0baede84a60b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85e2f91a92cf4b5a9d626e8418f17322', 'neutron:revision_number': '15', 'neutron:security_group_ids': '2256d612-5a1d-4528-93f3-139a5d1ff76a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e46e490-abb3-4025-b870-a46519cde774, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=7b8033d7-6209-4ba1-8605-72623902a9a9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.352 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 7b8033d7-6209-4ba1-8605-72623902a9a9 in datapath ed11b71b-745b-4f0c-9f09-37d53d166bcb unbound from our chassis
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.354 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed11b71b-745b-4f0c-9f09-37d53d166bcb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.355 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[9c2653c5-0e9c-45a5-a3a7-a015d6ee6e9b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.356 104348 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb namespace which is not needed anymore
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.361 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:14 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Dec 03 00:04:14 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000c.scope: Consumed 1.804s CPU time.
Dec 03 00:04:14 compute-1 systemd-machined[153454]: Machine qemu-10-instance-0000000c terminated.
Dec 03 00:04:14 compute-1 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[213217]: [NOTICE]   (213221) : haproxy version is 3.0.5-8e879a5
Dec 03 00:04:14 compute-1 podman[213621]: 2025-12-03 00:04:14.489038067 +0000 UTC m=+0.031782858 container kill de680ccf77eb801e4b15ff8990f64c520784cad3644eb6d3377d0e9cce775fd2 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 03 00:04:14 compute-1 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[213217]: [NOTICE]   (213221) : path to executable is /usr/sbin/haproxy
Dec 03 00:04:14 compute-1 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[213217]: [WARNING]  (213221) : Exiting Master process...
Dec 03 00:04:14 compute-1 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[213217]: [ALERT]    (213221) : Current worker (213223) exited with code 143 (Terminated)
Dec 03 00:04:14 compute-1 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[213217]: [WARNING]  (213221) : All workers exited. Exiting... (0)
Dec 03 00:04:14 compute-1 systemd[1]: libpod-de680ccf77eb801e4b15ff8990f64c520784cad3644eb6d3377d0e9cce775fd2.scope: Deactivated successfully.
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.540 187161 DEBUG nova.compute.manager [req-92281d4b-18bb-4032-a9e5-db4c3562f025 req-ae14c06b-b1d1-47f4-a100-77f389197bc9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.540 187161 DEBUG oslo_concurrency.lockutils [req-92281d4b-18bb-4032-a9e5-db4c3562f025 req-ae14c06b-b1d1-47f4-a100-77f389197bc9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.540 187161 DEBUG oslo_concurrency.lockutils [req-92281d4b-18bb-4032-a9e5-db4c3562f025 req-ae14c06b-b1d1-47f4-a100-77f389197bc9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.541 187161 DEBUG oslo_concurrency.lockutils [req-92281d4b-18bb-4032-a9e5-db4c3562f025 req-ae14c06b-b1d1-47f4-a100-77f389197bc9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.541 187161 DEBUG nova.compute.manager [req-92281d4b-18bb-4032-a9e5-db4c3562f025 req-ae14c06b-b1d1-47f4-a100-77f389197bc9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] No waiting events found dispatching network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.541 187161 DEBUG nova.compute.manager [req-92281d4b-18bb-4032-a9e5-db4c3562f025 req-ae14c06b-b1d1-47f4-a100-77f389197bc9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.541 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.545 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:14 compute-1 podman[213636]: 2025-12-03 00:04:14.547493742 +0000 UTC m=+0.026596405 container died de680ccf77eb801e4b15ff8990f64c520784cad3644eb6d3377d0e9cce775fd2 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Dec 03 00:04:14 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-de680ccf77eb801e4b15ff8990f64c520784cad3644eb6d3377d0e9cce775fd2-userdata-shm.mount: Deactivated successfully.
Dec 03 00:04:14 compute-1 systemd[1]: var-lib-containers-storage-overlay-5594c4650ba230212cba7a965d6907f11c2c08c93085055ac2c937a30c8b7d57-merged.mount: Deactivated successfully.
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.585 187161 INFO nova.virt.libvirt.driver [-] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Instance destroyed successfully.
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.586 187161 DEBUG nova.objects.instance [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lazy-loading 'resources' on Instance uuid 1c6c7975-72fd-442a-b75f-0baede84a60b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:04:14 compute-1 podman[213636]: 2025-12-03 00:04:14.592093666 +0000 UTC m=+0.071196309 container cleanup de680ccf77eb801e4b15ff8990f64c520784cad3644eb6d3377d0e9cce775fd2 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 03 00:04:14 compute-1 systemd[1]: libpod-conmon-de680ccf77eb801e4b15ff8990f64c520784cad3644eb6d3377d0e9cce775fd2.scope: Deactivated successfully.
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.604 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.605 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:14 compute-1 podman[213638]: 2025-12-03 00:04:14.612137164 +0000 UTC m=+0.092724513 container remove de680ccf77eb801e4b15ff8990f64c520784cad3644eb6d3377d0e9cce775fd2 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.618 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[bb82d554-96d2-4f34-957e-34fa6c9103ab]: (4, ("Wed Dec  3 12:04:14 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb (de680ccf77eb801e4b15ff8990f64c520784cad3644eb6d3377d0e9cce775fd2)\nde680ccf77eb801e4b15ff8990f64c520784cad3644eb6d3377d0e9cce775fd2\nWed Dec  3 12:04:14 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb (de680ccf77eb801e4b15ff8990f64c520784cad3644eb6d3377d0e9cce775fd2)\nde680ccf77eb801e4b15ff8990f64c520784cad3644eb6d3377d0e9cce775fd2\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.620 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[073836f4-1e45-4399-a27e-2ebc529b0ccf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.621 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.621 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[6b38d5aa-6057-4a99-9175-3816d70b4bc0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.622 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped11b71b-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.623 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:14 compute-1 kernel: taped11b71b-70: left promiscuous mode
Dec 03 00:04:14 compute-1 nova_compute[187157]: 2025-12-03 00:04:14.644 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.647 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[3f918765-8118-4856-9857-766828fe2964]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.656 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[767f835f-9296-47ea-b0d3-1f1646e627f7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.658 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b5f53817-70a8-4f1d-94cf-7879d6f7ad61]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.672 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[0338d656-1e39-4a91-9d7a-3b4ee29b2fba]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422169, 'reachable_time': 22499, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213686, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:14 compute-1 systemd[1]: run-netns-ovnmeta\x2ded11b71b\x2d745b\x2d4f0c\x2d9f09\x2d37d53d166bcb.mount: Deactivated successfully.
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.675 104464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.676 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[139a504d-30d4-4a05-8bfc-9c09d0bf1fcc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:14.676 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.093 187161 DEBUG nova.virt.libvirt.vif [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-03T00:02:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-233597543',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-233597543',id=12,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:03:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-3he48fro',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_
model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:04:03Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=1c6c7975-72fd-442a-b75f-0baede84a60b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.094 187161 DEBUG nova.network.os_vif_util [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converting VIF {"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.094 187161 DEBUG nova.network.os_vif_util [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:68:79:f2,bridge_name='br-int',has_traffic_filtering=True,id=7b8033d7-6209-4ba1-8605-72623902a9a9,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b8033d7-62') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.095 187161 DEBUG os_vif [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:68:79:f2,bridge_name='br-int',has_traffic_filtering=True,id=7b8033d7-6209-4ba1-8605-72623902a9a9,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b8033d7-62') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.096 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.096 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b8033d7-62, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.098 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.100 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.101 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.101 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=aa0e229a-2072-4307-80f3-cc04d54c8f36) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.102 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.102 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.104 187161 INFO os_vif [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:68:79:f2,bridge_name='br-int',has_traffic_filtering=True,id=7b8033d7-6209-4ba1-8605-72623902a9a9,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b8033d7-62')
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.105 187161 INFO nova.virt.libvirt.driver [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Deleting instance files /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b_del
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.105 187161 INFO nova.virt.libvirt.driver [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Deletion of /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b_del complete
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.618 187161 INFO nova.compute.manager [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Took 1.31 seconds to destroy the instance on the hypervisor.
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.618 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.618 187161 DEBUG nova.compute.manager [-] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.618 187161 DEBUG nova.network.neutron [-] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:04:15 compute-1 nova_compute[187157]: 2025-12-03 00:04:15.619 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:16 compute-1 nova_compute[187157]: 2025-12-03 00:04:16.451 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:16 compute-1 nova_compute[187157]: 2025-12-03 00:04:16.606 187161 DEBUG nova.compute.manager [req-be7dab91-cbb2-4778-8266-5e95fb73673c req-f4ce1bb8-9e39-47fc-86d6-1dec6d9f05d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:04:16 compute-1 nova_compute[187157]: 2025-12-03 00:04:16.607 187161 DEBUG oslo_concurrency.lockutils [req-be7dab91-cbb2-4778-8266-5e95fb73673c req-f4ce1bb8-9e39-47fc-86d6-1dec6d9f05d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:16 compute-1 nova_compute[187157]: 2025-12-03 00:04:16.607 187161 DEBUG oslo_concurrency.lockutils [req-be7dab91-cbb2-4778-8266-5e95fb73673c req-f4ce1bb8-9e39-47fc-86d6-1dec6d9f05d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:16 compute-1 nova_compute[187157]: 2025-12-03 00:04:16.607 187161 DEBUG oslo_concurrency.lockutils [req-be7dab91-cbb2-4778-8266-5e95fb73673c req-f4ce1bb8-9e39-47fc-86d6-1dec6d9f05d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:16 compute-1 nova_compute[187157]: 2025-12-03 00:04:16.607 187161 DEBUG nova.compute.manager [req-be7dab91-cbb2-4778-8266-5e95fb73673c req-f4ce1bb8-9e39-47fc-86d6-1dec6d9f05d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] No waiting events found dispatching network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:04:16 compute-1 nova_compute[187157]: 2025-12-03 00:04:16.608 187161 DEBUG nova.compute.manager [req-be7dab91-cbb2-4778-8266-5e95fb73673c req-f4ce1bb8-9e39-47fc-86d6-1dec6d9f05d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:04:17 compute-1 nova_compute[187157]: 2025-12-03 00:04:17.452 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:17 compute-1 nova_compute[187157]: 2025-12-03 00:04:17.864 187161 DEBUG nova.network.neutron [-] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:04:18 compute-1 nova_compute[187157]: 2025-12-03 00:04:18.371 187161 INFO nova.compute.manager [-] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Took 2.75 seconds to deallocate network for instance.
Dec 03 00:04:18 compute-1 nova_compute[187157]: 2025-12-03 00:04:18.676 187161 DEBUG nova.compute.manager [req-3d09d223-f3a1-478e-8bdb-9533fbbf743c req-4858bf1e-4e0b-45b1-b6ba-2d09424532c1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-deleted-7b8033d7-6209-4ba1-8605-72623902a9a9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:04:18 compute-1 nova_compute[187157]: 2025-12-03 00:04:18.896 187161 DEBUG oslo_concurrency.lockutils [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:18 compute-1 nova_compute[187157]: 2025-12-03 00:04:18.896 187161 DEBUG oslo_concurrency.lockutils [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:18 compute-1 nova_compute[187157]: 2025-12-03 00:04:18.901 187161 DEBUG oslo_concurrency.lockutils [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:18 compute-1 nova_compute[187157]: 2025-12-03 00:04:18.934 187161 INFO nova.scheduler.client.report [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Deleted allocations for instance 1c6c7975-72fd-442a-b75f-0baede84a60b
Dec 03 00:04:19 compute-1 openstack_network_exporter[199685]: ERROR   00:04:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:04:19 compute-1 openstack_network_exporter[199685]: ERROR   00:04:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:04:19 compute-1 openstack_network_exporter[199685]: ERROR   00:04:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:04:19 compute-1 openstack_network_exporter[199685]: ERROR   00:04:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:04:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:04:19 compute-1 openstack_network_exporter[199685]: ERROR   00:04:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:04:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:04:19 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:19.677 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:19 compute-1 nova_compute[187157]: 2025-12-03 00:04:19.960 187161 DEBUG oslo_concurrency.lockutils [None req-029336a6-7da2-4bf9-84df-5c3ade3eecc4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.186s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:20 compute-1 nova_compute[187157]: 2025-12-03 00:04:20.105 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:22 compute-1 nova_compute[187157]: 2025-12-03 00:04:22.453 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:25 compute-1 nova_compute[187157]: 2025-12-03 00:04:25.107 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:27 compute-1 podman[213688]: 2025-12-03 00:04:27.208587436 +0000 UTC m=+0.053816556 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 03 00:04:27 compute-1 nova_compute[187157]: 2025-12-03 00:04:27.455 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:29 compute-1 podman[213710]: 2025-12-03 00:04:29.21168567 +0000 UTC m=+0.053888997 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:04:30 compute-1 nova_compute[187157]: 2025-12-03 00:04:30.109 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:31 compute-1 nova_compute[187157]: 2025-12-03 00:04:31.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:32 compute-1 nova_compute[187157]: 2025-12-03 00:04:32.456 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:35 compute-1 nova_compute[187157]: 2025-12-03 00:04:35.111 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:35 compute-1 podman[197537]: time="2025-12-03T00:04:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:04:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:04:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:04:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:04:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2615 "" "Go-http-client/1.1"
Dec 03 00:04:36 compute-1 nova_compute[187157]: 2025-12-03 00:04:36.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:37 compute-1 podman[213731]: 2025-12-03 00:04:37.221108 +0000 UTC m=+0.061523678 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:04:37 compute-1 nova_compute[187157]: 2025-12-03 00:04:37.459 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:39 compute-1 nova_compute[187157]: 2025-12-03 00:04:39.696 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:40 compute-1 nova_compute[187157]: 2025-12-03 00:04:40.113 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:40 compute-1 nova_compute[187157]: 2025-12-03 00:04:40.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:40 compute-1 nova_compute[187157]: 2025-12-03 00:04:40.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:41 compute-1 nova_compute[187157]: 2025-12-03 00:04:41.217 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:41 compute-1 nova_compute[187157]: 2025-12-03 00:04:41.217 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:41 compute-1 nova_compute[187157]: 2025-12-03 00:04:41.217 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:41 compute-1 nova_compute[187157]: 2025-12-03 00:04:41.217 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:04:41 compute-1 nova_compute[187157]: 2025-12-03 00:04:41.366 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:04:41 compute-1 nova_compute[187157]: 2025-12-03 00:04:41.367 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:41 compute-1 nova_compute[187157]: 2025-12-03 00:04:41.383 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:41 compute-1 nova_compute[187157]: 2025-12-03 00:04:41.384 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5854MB free_disk=73.16822052001953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:04:41 compute-1 nova_compute[187157]: 2025-12-03 00:04:41.384 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:41 compute-1 nova_compute[187157]: 2025-12-03 00:04:41.385 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:42 compute-1 nova_compute[187157]: 2025-12-03 00:04:42.430 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:04:42 compute-1 nova_compute[187157]: 2025-12-03 00:04:42.430 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:04:41 up  1:11,  0 user,  load average: 0.28, 0.32, 0.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:04:42 compute-1 nova_compute[187157]: 2025-12-03 00:04:42.461 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:42 compute-1 nova_compute[187157]: 2025-12-03 00:04:42.481 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:04:42 compute-1 nova_compute[187157]: 2025-12-03 00:04:42.987 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:04:43 compute-1 podman[213758]: 2025-12-03 00:04:43.249648559 +0000 UTC m=+0.084343563 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 03 00:04:43 compute-1 nova_compute[187157]: 2025-12-03 00:04:43.496 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:04:43 compute-1 nova_compute[187157]: 2025-12-03 00:04:43.497 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.112s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:44 compute-1 nova_compute[187157]: 2025-12-03 00:04:44.569 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:44 compute-1 nova_compute[187157]: 2025-12-03 00:04:44.569 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:45 compute-1 nova_compute[187157]: 2025-12-03 00:04:45.073 187161 DEBUG nova.compute.manager [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:04:45 compute-1 nova_compute[187157]: 2025-12-03 00:04:45.115 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:45 compute-1 podman[213784]: 2025-12-03 00:04:45.218296961 +0000 UTC m=+0.052284259 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 03 00:04:45 compute-1 nova_compute[187157]: 2025-12-03 00:04:45.617 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:45 compute-1 nova_compute[187157]: 2025-12-03 00:04:45.618 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:45 compute-1 nova_compute[187157]: 2025-12-03 00:04:45.624 187161 DEBUG nova.virt.hardware [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:04:45 compute-1 nova_compute[187157]: 2025-12-03 00:04:45.624 187161 INFO nova.compute.claims [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Claim successful on node compute-1.ctlplane.example.com
Dec 03 00:04:46 compute-1 sshd-session[213803]: Invalid user sol from 193.32.162.146 port 38712
Dec 03 00:04:46 compute-1 nova_compute[187157]: 2025-12-03 00:04:46.673 187161 DEBUG nova.compute.provider_tree [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:04:46 compute-1 sshd-session[213803]: Connection closed by invalid user sol 193.32.162.146 port 38712 [preauth]
Dec 03 00:04:47 compute-1 nova_compute[187157]: 2025-12-03 00:04:47.179 187161 DEBUG nova.scheduler.client.report [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:04:47 compute-1 nova_compute[187157]: 2025-12-03 00:04:47.462 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:47 compute-1 nova_compute[187157]: 2025-12-03 00:04:47.496 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:47 compute-1 nova_compute[187157]: 2025-12-03 00:04:47.497 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:47 compute-1 nova_compute[187157]: 2025-12-03 00:04:47.497 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:47 compute-1 nova_compute[187157]: 2025-12-03 00:04:47.497 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:04:47 compute-1 nova_compute[187157]: 2025-12-03 00:04:47.690 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.072s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:47 compute-1 nova_compute[187157]: 2025-12-03 00:04:47.691 187161 DEBUG nova.compute.manager [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:04:48 compute-1 nova_compute[187157]: 2025-12-03 00:04:48.200 187161 DEBUG nova.compute.manager [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:04:48 compute-1 nova_compute[187157]: 2025-12-03 00:04:48.200 187161 DEBUG nova.network.neutron [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:04:48 compute-1 nova_compute[187157]: 2025-12-03 00:04:48.201 187161 WARNING neutronclient.v2_0.client [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:48 compute-1 nova_compute[187157]: 2025-12-03 00:04:48.201 187161 WARNING neutronclient.v2_0.client [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:48 compute-1 nova_compute[187157]: 2025-12-03 00:04:48.707 187161 INFO nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:04:48 compute-1 nova_compute[187157]: 2025-12-03 00:04:48.749 187161 DEBUG nova.network.neutron [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Successfully created port: 665e4d53-decd-4665-b2a5-879bf031819d _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:04:49 compute-1 nova_compute[187157]: 2025-12-03 00:04:49.214 187161 DEBUG nova.compute.manager [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:04:49 compute-1 nova_compute[187157]: 2025-12-03 00:04:49.361 187161 DEBUG nova.network.neutron [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Successfully updated port: 665e4d53-decd-4665-b2a5-879bf031819d _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:04:49 compute-1 nova_compute[187157]: 2025-12-03 00:04:49.414 187161 DEBUG nova.compute.manager [req-9f5bfc8b-bd4b-4e61-8a0d-2841e7cab315 req-7805770b-09c8-45f0-92b9-8baeffbedd67 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Received event network-changed-665e4d53-decd-4665-b2a5-879bf031819d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:04:49 compute-1 nova_compute[187157]: 2025-12-03 00:04:49.414 187161 DEBUG nova.compute.manager [req-9f5bfc8b-bd4b-4e61-8a0d-2841e7cab315 req-7805770b-09c8-45f0-92b9-8baeffbedd67 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Refreshing instance network info cache due to event network-changed-665e4d53-decd-4665-b2a5-879bf031819d. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:04:49 compute-1 nova_compute[187157]: 2025-12-03 00:04:49.414 187161 DEBUG oslo_concurrency.lockutils [req-9f5bfc8b-bd4b-4e61-8a0d-2841e7cab315 req-7805770b-09c8-45f0-92b9-8baeffbedd67 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-6d03c9dd-e243-47ab-abb5-8a2a5387297d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:04:49 compute-1 nova_compute[187157]: 2025-12-03 00:04:49.414 187161 DEBUG oslo_concurrency.lockutils [req-9f5bfc8b-bd4b-4e61-8a0d-2841e7cab315 req-7805770b-09c8-45f0-92b9-8baeffbedd67 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-6d03c9dd-e243-47ab-abb5-8a2a5387297d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:04:49 compute-1 nova_compute[187157]: 2025-12-03 00:04:49.415 187161 DEBUG nova.network.neutron [req-9f5bfc8b-bd4b-4e61-8a0d-2841e7cab315 req-7805770b-09c8-45f0-92b9-8baeffbedd67 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Refreshing network info cache for port 665e4d53-decd-4665-b2a5-879bf031819d _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:04:49 compute-1 openstack_network_exporter[199685]: ERROR   00:04:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:04:49 compute-1 openstack_network_exporter[199685]: ERROR   00:04:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:04:49 compute-1 openstack_network_exporter[199685]: ERROR   00:04:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:04:49 compute-1 openstack_network_exporter[199685]: ERROR   00:04:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:04:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:04:49 compute-1 openstack_network_exporter[199685]: ERROR   00:04:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:04:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:04:49 compute-1 nova_compute[187157]: 2025-12-03 00:04:49.697 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:49 compute-1 nova_compute[187157]: 2025-12-03 00:04:49.868 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "refresh_cache-6d03c9dd-e243-47ab-abb5-8a2a5387297d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:04:49 compute-1 nova_compute[187157]: 2025-12-03 00:04:49.920 187161 WARNING neutronclient.v2_0.client [req-9f5bfc8b-bd4b-4e61-8a0d-2841e7cab315 req-7805770b-09c8-45f0-92b9-8baeffbedd67 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.116 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.230 187161 DEBUG nova.compute.manager [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.232 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.232 187161 INFO nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Creating image(s)
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.233 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "/var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.233 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "/var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.234 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "/var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.234 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.237 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.238 187161 DEBUG oslo_concurrency.processutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.319 187161 DEBUG oslo_concurrency.processutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.320 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.320 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.321 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.323 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.324 187161 DEBUG oslo_concurrency.processutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.372 187161 DEBUG oslo_concurrency.processutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.373 187161 DEBUG oslo_concurrency.processutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.403 187161 DEBUG oslo_concurrency.processutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.404 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.084s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.405 187161 DEBUG oslo_concurrency.processutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.457 187161 DEBUG oslo_concurrency.processutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.458 187161 DEBUG nova.virt.disk.api [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Checking if we can resize image /var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.458 187161 DEBUG oslo_concurrency.processutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.484 187161 DEBUG nova.network.neutron [req-9f5bfc8b-bd4b-4e61-8a0d-2841e7cab315 req-7805770b-09c8-45f0-92b9-8baeffbedd67 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.520 187161 DEBUG oslo_concurrency.processutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.521 187161 DEBUG nova.virt.disk.api [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Cannot resize image /var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.521 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.522 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Ensure instance console log exists: /var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.522 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.523 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.523 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:50 compute-1 nova_compute[187157]: 2025-12-03 00:04:50.715 187161 DEBUG nova.network.neutron [req-9f5bfc8b-bd4b-4e61-8a0d-2841e7cab315 req-7805770b-09c8-45f0-92b9-8baeffbedd67 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:04:51 compute-1 nova_compute[187157]: 2025-12-03 00:04:51.222 187161 DEBUG oslo_concurrency.lockutils [req-9f5bfc8b-bd4b-4e61-8a0d-2841e7cab315 req-7805770b-09c8-45f0-92b9-8baeffbedd67 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-6d03c9dd-e243-47ab-abb5-8a2a5387297d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:04:51 compute-1 nova_compute[187157]: 2025-12-03 00:04:51.224 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquired lock "refresh_cache-6d03c9dd-e243-47ab-abb5-8a2a5387297d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:04:51 compute-1 nova_compute[187157]: 2025-12-03 00:04:51.224 187161 DEBUG nova.network.neutron [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:04:52 compute-1 nova_compute[187157]: 2025-12-03 00:04:52.064 187161 DEBUG nova.network.neutron [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:04:52 compute-1 nova_compute[187157]: 2025-12-03 00:04:52.400 187161 WARNING neutronclient.v2_0.client [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:52 compute-1 nova_compute[187157]: 2025-12-03 00:04:52.468 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:52 compute-1 nova_compute[187157]: 2025-12-03 00:04:52.532 187161 DEBUG nova.network.neutron [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Updating instance_info_cache with network_info: [{"id": "665e4d53-decd-4665-b2a5-879bf031819d", "address": "fa:16:3e:e6:bc:36", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap665e4d53-de", "ovs_interfaceid": "665e4d53-decd-4665-b2a5-879bf031819d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.044 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Releasing lock "refresh_cache-6d03c9dd-e243-47ab-abb5-8a2a5387297d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.044 187161 DEBUG nova.compute.manager [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Instance network_info: |[{"id": "665e4d53-decd-4665-b2a5-879bf031819d", "address": "fa:16:3e:e6:bc:36", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap665e4d53-de", "ovs_interfaceid": "665e4d53-decd-4665-b2a5-879bf031819d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.046 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Start _get_guest_xml network_info=[{"id": "665e4d53-decd-4665-b2a5-879bf031819d", "address": "fa:16:3e:e6:bc:36", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap665e4d53-de", "ovs_interfaceid": "665e4d53-decd-4665-b2a5-879bf031819d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.050 187161 WARNING nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.051 187161 DEBUG nova.virt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-2014606199', uuid='6d03c9dd-e243-47ab-abb5-8a2a5387297d'), owner=OwnerMeta(userid='ab182b4a69794d1fa103fbd3d503df99', username='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin', projectid='85e2f91a92cf4b5a9d626e8418f17322', projectname='tempest-TestExecuteHostMaintenanceStrategy-1767783627'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "665e4d53-decd-4665-b2a5-879bf031819d", "address": "fa:16:3e:e6:bc:36", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap665e4d53-de", "ovs_interfaceid": 
"665e4d53-decd-4665-b2a5-879bf031819d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764720293.0518966) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.063 187161 DEBUG nova.virt.libvirt.host [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.063 187161 DEBUG nova.virt.libvirt.host [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.066 187161 DEBUG nova.virt.libvirt.host [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.067 187161 DEBUG nova.virt.libvirt.host [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.068 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.068 187161 DEBUG nova.virt.hardware [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.068 187161 DEBUG nova.virt.hardware [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.069 187161 DEBUG nova.virt.hardware [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.069 187161 DEBUG nova.virt.hardware [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.069 187161 DEBUG nova.virt.hardware [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.069 187161 DEBUG nova.virt.hardware [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.069 187161 DEBUG nova.virt.hardware [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.070 187161 DEBUG nova.virt.hardware [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.070 187161 DEBUG nova.virt.hardware [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.070 187161 DEBUG nova.virt.hardware [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.070 187161 DEBUG nova.virt.hardware [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.075 187161 DEBUG nova.virt.libvirt.vif [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:04:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-2014606199',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-2014606199',id=15,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-3rl8nbar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='tempest-T
estExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:04:49Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=6d03c9dd-e243-47ab-abb5-8a2a5387297d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "665e4d53-decd-4665-b2a5-879bf031819d", "address": "fa:16:3e:e6:bc:36", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap665e4d53-de", "ovs_interfaceid": "665e4d53-decd-4665-b2a5-879bf031819d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.075 187161 DEBUG nova.network.os_vif_util [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converting VIF {"id": "665e4d53-decd-4665-b2a5-879bf031819d", "address": "fa:16:3e:e6:bc:36", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap665e4d53-de", "ovs_interfaceid": "665e4d53-decd-4665-b2a5-879bf031819d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.076 187161 DEBUG nova.network.os_vif_util [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:bc:36,bridge_name='br-int',has_traffic_filtering=True,id=665e4d53-decd-4665-b2a5-879bf031819d,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap665e4d53-de') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.076 187161 DEBUG nova.objects.instance [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6d03c9dd-e243-47ab-abb5-8a2a5387297d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.587 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:04:53 compute-1 nova_compute[187157]:   <uuid>6d03c9dd-e243-47ab-abb5-8a2a5387297d</uuid>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   <name>instance-0000000f</name>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   <memory>131072</memory>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   <vcpu>1</vcpu>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   <metadata>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-2014606199</nova:name>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-03 00:04:53</nova:creationTime>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:04:53 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 03 00:04:53 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 03 00:04:53 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 03 00:04:53 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:04:53 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:04:53 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 03 00:04:53 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:04:53 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:04:53 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:04:53 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:04:53 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:04:53 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 03 00:04:53 compute-1 nova_compute[187157]:         <nova:properties>
Dec 03 00:04:53 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:04:53 compute-1 nova_compute[187157]:         </nova:properties>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       </nova:image>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <nova:owner>
Dec 03 00:04:53 compute-1 nova_compute[187157]:         <nova:user uuid="ab182b4a69794d1fa103fbd3d503df99">tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin</nova:user>
Dec 03 00:04:53 compute-1 nova_compute[187157]:         <nova:project uuid="85e2f91a92cf4b5a9d626e8418f17322">tempest-TestExecuteHostMaintenanceStrategy-1767783627</nova:project>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       </nova:owner>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <nova:ports>
Dec 03 00:04:53 compute-1 nova_compute[187157]:         <nova:port uuid="665e4d53-decd-4665-b2a5-879bf031819d">
Dec 03 00:04:53 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:         </nova:port>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       </nova:ports>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     </nova:instance>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   </metadata>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <system>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <entry name="serial">6d03c9dd-e243-47ab-abb5-8a2a5387297d</entry>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <entry name="uuid">6d03c9dd-e243-47ab-abb5-8a2a5387297d</entry>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     </system>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   </sysinfo>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   <os>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   </os>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   <features>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <acpi/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <apic/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <vmcoreinfo/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   </features>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   </clock>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact">
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <model>Nehalem</model>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   </cpu>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   <devices>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk.config"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:e6:bc:36"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <target dev="tap665e4d53-de"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     </interface>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/console.log" append="off"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     </serial>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <video>
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     </video>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     </rng>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <controller type="usb" index="0"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:04:53 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 03 00:04:53 compute-1 nova_compute[187157]:     </memballoon>
Dec 03 00:04:53 compute-1 nova_compute[187157]:   </devices>
Dec 03 00:04:53 compute-1 nova_compute[187157]: </domain>
Dec 03 00:04:53 compute-1 nova_compute[187157]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.589 187161 DEBUG nova.compute.manager [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Preparing to wait for external event network-vif-plugged-665e4d53-decd-4665-b2a5-879bf031819d prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.590 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.590 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.590 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.591 187161 DEBUG nova.virt.libvirt.vif [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:04:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-2014606199',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-2014606199',id=15,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-3rl8nbar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name=
'tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:04:49Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=6d03c9dd-e243-47ab-abb5-8a2a5387297d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "665e4d53-decd-4665-b2a5-879bf031819d", "address": "fa:16:3e:e6:bc:36", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap665e4d53-de", "ovs_interfaceid": "665e4d53-decd-4665-b2a5-879bf031819d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.591 187161 DEBUG nova.network.os_vif_util [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converting VIF {"id": "665e4d53-decd-4665-b2a5-879bf031819d", "address": "fa:16:3e:e6:bc:36", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap665e4d53-de", "ovs_interfaceid": "665e4d53-decd-4665-b2a5-879bf031819d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.592 187161 DEBUG nova.network.os_vif_util [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:bc:36,bridge_name='br-int',has_traffic_filtering=True,id=665e4d53-decd-4665-b2a5-879bf031819d,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap665e4d53-de') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.592 187161 DEBUG os_vif [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:bc:36,bridge_name='br-int',has_traffic_filtering=True,id=665e4d53-decd-4665-b2a5-879bf031819d,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap665e4d53-de') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.593 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.593 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.594 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.595 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.596 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'f487ed48-1d59-5cbc-b38f-65e273685635', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.597 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.598 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.600 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.601 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap665e4d53-de, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.601 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap665e4d53-de, col_values=(('qos', UUID('cdbb433a-e623-4d4f-b312-2fad77af1962')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.601 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap665e4d53-de, col_values=(('external_ids', {'iface-id': '665e4d53-decd-4665-b2a5-879bf031819d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e6:bc:36', 'vm-uuid': '6d03c9dd-e243-47ab-abb5-8a2a5387297d'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.602 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:53 compute-1 NetworkManager[55553]: <info>  [1764720293.6036] manager: (tap665e4d53-de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.604 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.611 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:53 compute-1 nova_compute[187157]: 2025-12-03 00:04:53.612 187161 INFO os_vif [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:bc:36,bridge_name='br-int',has_traffic_filtering=True,id=665e4d53-decd-4665-b2a5-879bf031819d,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap665e4d53-de')
Dec 03 00:04:55 compute-1 nova_compute[187157]: 2025-12-03 00:04:55.186 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:04:55 compute-1 nova_compute[187157]: 2025-12-03 00:04:55.186 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:04:55 compute-1 nova_compute[187157]: 2025-12-03 00:04:55.187 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] No VIF found with MAC fa:16:3e:e6:bc:36, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:04:55 compute-1 nova_compute[187157]: 2025-12-03 00:04:55.187 187161 INFO nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Using config drive
Dec 03 00:04:55 compute-1 nova_compute[187157]: 2025-12-03 00:04:55.698 187161 WARNING neutronclient.v2_0.client [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:56 compute-1 nova_compute[187157]: 2025-12-03 00:04:56.587 187161 INFO nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Creating config drive at /var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk.config
Dec 03 00:04:56 compute-1 nova_compute[187157]: 2025-12-03 00:04:56.594 187161 DEBUG oslo_concurrency.processutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpsw8n7yy6 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:56 compute-1 nova_compute[187157]: 2025-12-03 00:04:56.721 187161 DEBUG oslo_concurrency.processutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpsw8n7yy6" returned: 0 in 0.127s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:56 compute-1 kernel: tap665e4d53-de: entered promiscuous mode
Dec 03 00:04:56 compute-1 NetworkManager[55553]: <info>  [1764720296.8142] manager: (tap665e4d53-de): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Dec 03 00:04:56 compute-1 ovn_controller[95464]: 2025-12-03T00:04:56Z|00133|binding|INFO|Claiming lport 665e4d53-decd-4665-b2a5-879bf031819d for this chassis.
Dec 03 00:04:56 compute-1 nova_compute[187157]: 2025-12-03 00:04:56.812 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:56 compute-1 ovn_controller[95464]: 2025-12-03T00:04:56Z|00134|binding|INFO|665e4d53-decd-4665-b2a5-879bf031819d: Claiming fa:16:3e:e6:bc:36 10.100.0.10
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.827 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:bc:36 10.100.0.10'], port_security=['fa:16:3e:e6:bc:36 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '6d03c9dd-e243-47ab-abb5-8a2a5387297d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85e2f91a92cf4b5a9d626e8418f17322', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2256d612-5a1d-4528-93f3-139a5d1ff76a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e46e490-abb3-4025-b870-a46519cde774, chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=665e4d53-decd-4665-b2a5-879bf031819d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:04:56 compute-1 ovn_controller[95464]: 2025-12-03T00:04:56Z|00135|binding|INFO|Setting lport 665e4d53-decd-4665-b2a5-879bf031819d ovn-installed in OVS
Dec 03 00:04:56 compute-1 ovn_controller[95464]: 2025-12-03T00:04:56Z|00136|binding|INFO|Setting lport 665e4d53-decd-4665-b2a5-879bf031819d up in Southbound
Dec 03 00:04:56 compute-1 nova_compute[187157]: 2025-12-03 00:04:56.828 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.828 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 665e4d53-decd-4665-b2a5-879bf031819d in datapath ed11b71b-745b-4f0c-9f09-37d53d166bcb bound to our chassis
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.830 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed11b71b-745b-4f0c-9f09-37d53d166bcb
Dec 03 00:04:56 compute-1 nova_compute[187157]: 2025-12-03 00:04:56.831 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:56 compute-1 nova_compute[187157]: 2025-12-03 00:04:56.831 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:56 compute-1 systemd-udevd[213841]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.844 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[9c55bc1a-4791-48cf-a5aa-82f633efa078]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.845 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH taped11b71b-71 in ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.848 207957 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface taped11b71b-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.848 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[beb7625e-6aec-4ef8-b643-5111599b9d67]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.850 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[f0350399-2437-4b25-8600-b991446b70b5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:56 compute-1 systemd-machined[153454]: New machine qemu-11-instance-0000000f.
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.861 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[a2865061-613c-445e-9cd0-9d0d62430b4f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:56 compute-1 NetworkManager[55553]: <info>  [1764720296.8631] device (tap665e4d53-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:04:56 compute-1 NetworkManager[55553]: <info>  [1764720296.8642] device (tap665e4d53-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:04:56 compute-1 systemd[1]: Started Virtual Machine qemu-11-instance-0000000f.
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.878 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[6d63951d-0beb-453c-b9be-e5a24bf201dd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.909 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[d88a1403-7059-47cd-b229-4d36e002720a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.914 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[9fd3695e-8bbc-4eb9-90d7-0314f5fec664]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:56 compute-1 NetworkManager[55553]: <info>  [1764720296.9151] manager: (taped11b71b-70): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.941 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[8652dd6b-f21d-4dbe-871b-ac235c229e6d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.943 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[336357ea-87b6-4882-9e3c-b0444f42950c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:56 compute-1 NetworkManager[55553]: <info>  [1764720296.9619] device (taped11b71b-70): carrier: link connected
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.965 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[d3644c74-de4b-441a-85b2-e3dd8aa04356]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.979 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ab162f56-9abc-421e-8def-d3be698721de]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped11b71b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:bb:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431661, 'reachable_time': 26201, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213875, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:56.994 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[0b4246cf-e6b4-48ee-802b-c425f8ef0b87]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6d:bbc6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431661, 'tstamp': 431661}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213876, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.007 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[8ebe5038-512a-4b96-b7cd-04d1800eb7a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped11b71b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:bb:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431661, 'reachable_time': 26201, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213877, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.041 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[1b19e577-d76c-4b16-8b31-5fe62c041d2f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.097 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[21173468-fb0a-45ed-a06b-c47264129636]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.098 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped11b71b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.098 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.098 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped11b71b-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:57 compute-1 NetworkManager[55553]: <info>  [1764720297.1213] manager: (taped11b71b-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.120 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:57 compute-1 kernel: taped11b71b-70: entered promiscuous mode
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.124 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.124 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped11b71b-70, col_values=(('external_ids', {'iface-id': 'add6ea4f-8836-4bed-8f1e-39e943ccf4b5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.125 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:57 compute-1 ovn_controller[95464]: 2025-12-03T00:04:57Z|00137|binding|INFO|Releasing lport add6ea4f-8836-4bed-8f1e-39e943ccf4b5 from this chassis (sb_readonly=0)
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.127 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.128 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[8fa4c5c6-b8d8-4fc9-b06e-4046c71a1f3a]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.130 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.131 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.131 104348 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for ed11b71b-745b-4f0c-9f09-37d53d166bcb disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.131 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.131 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[998a7958-b228-4692-8f25-5cf2c04ca376]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.132 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.132 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[533abb00-c360-4344-95d3-f9e4399b289e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.133 104348 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: global
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     log         /dev/log local0 debug
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     log-tag     haproxy-metadata-proxy-ed11b71b-745b-4f0c-9f09-37d53d166bcb
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     user        root
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     group       root
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     maxconn     1024
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     pidfile     /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     daemon
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: defaults
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     log global
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     mode http
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     option httplog
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     option dontlognull
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     option http-server-close
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     option forwardfor
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     retries                 3
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     timeout http-request    30s
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     timeout connect         30s
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     timeout client          32s
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     timeout server          32s
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     timeout http-keep-alive 30s
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: listen listener
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     bind 169.254.169.254:80
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:     http-request add-header X-OVN-Network-ID ed11b71b-745b-4f0c-9f09-37d53d166bcb
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:04:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:04:57.133 104348 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'env', 'PROCESS_TAG=haproxy-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ed11b71b-745b-4f0c-9f09-37d53d166bcb.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.140 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.166 187161 DEBUG nova.compute.manager [req-be95669b-c952-492d-9d57-1ace0212c42d req-f34d3a90-3932-499c-97c3-4910c60374df 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Received event network-vif-plugged-665e4d53-decd-4665-b2a5-879bf031819d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.167 187161 DEBUG oslo_concurrency.lockutils [req-be95669b-c952-492d-9d57-1ace0212c42d req-f34d3a90-3932-499c-97c3-4910c60374df 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.167 187161 DEBUG oslo_concurrency.lockutils [req-be95669b-c952-492d-9d57-1ace0212c42d req-f34d3a90-3932-499c-97c3-4910c60374df 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.167 187161 DEBUG oslo_concurrency.lockutils [req-be95669b-c952-492d-9d57-1ace0212c42d req-f34d3a90-3932-499c-97c3-4910c60374df 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.168 187161 DEBUG nova.compute.manager [req-be95669b-c952-492d-9d57-1ace0212c42d req-f34d3a90-3932-499c-97c3-4910c60374df 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Processing event network-vif-plugged-665e4d53-decd-4665-b2a5-879bf031819d _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.391 187161 DEBUG nova.compute.manager [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.393 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.396 187161 INFO nova.virt.libvirt.driver [-] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Instance spawned successfully.
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.397 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.472 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:57 compute-1 podman[213916]: 2025-12-03 00:04:57.531225411 +0000 UTC m=+0.048675372 container create 28ef8ee99cee1243bcba3f46243723338232299b4b832adb7f68499eee6e683f (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Dec 03 00:04:57 compute-1 systemd[1]: Started libpod-conmon-28ef8ee99cee1243bcba3f46243723338232299b4b832adb7f68499eee6e683f.scope.
Dec 03 00:04:57 compute-1 podman[213916]: 2025-12-03 00:04:57.508016057 +0000 UTC m=+0.025466038 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:04:57 compute-1 systemd[1]: Started libcrun container.
Dec 03 00:04:57 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d25f39ef4c66ce9ed57fe02963525cdc3213770c3a55a0ad4d9b72831cfb6f16/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:04:57 compute-1 podman[213916]: 2025-12-03 00:04:57.635023527 +0000 UTC m=+0.152473508 container init 28ef8ee99cee1243bcba3f46243723338232299b4b832adb7f68499eee6e683f (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 03 00:04:57 compute-1 podman[213929]: 2025-12-03 00:04:57.638357516 +0000 UTC m=+0.078748329 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., release=1755695350, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 03 00:04:57 compute-1 podman[213916]: 2025-12-03 00:04:57.645912367 +0000 UTC m=+0.163362328 container start 28ef8ee99cee1243bcba3f46243723338232299b4b832adb7f68499eee6e683f (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Dec 03 00:04:57 compute-1 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[213945]: [NOTICE]   (213957) : New worker (213960) forked
Dec 03 00:04:57 compute-1 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[213945]: [NOTICE]   (213957) : Loading success.
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.908 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.909 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.909 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.910 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.910 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:04:57 compute-1 nova_compute[187157]: 2025-12-03 00:04:57.911 187161 DEBUG nova.virt.libvirt.driver [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:04:58 compute-1 nova_compute[187157]: 2025-12-03 00:04:58.424 187161 INFO nova.compute.manager [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Took 8.19 seconds to spawn the instance on the hypervisor.
Dec 03 00:04:58 compute-1 nova_compute[187157]: 2025-12-03 00:04:58.426 187161 DEBUG nova.compute.manager [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:04:58 compute-1 nova_compute[187157]: 2025-12-03 00:04:58.603 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:58 compute-1 nova_compute[187157]: 2025-12-03 00:04:58.960 187161 INFO nova.compute.manager [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Took 13.38 seconds to build instance.
Dec 03 00:04:59 compute-1 nova_compute[187157]: 2025-12-03 00:04:59.234 187161 DEBUG nova.compute.manager [req-d1ca3736-d0ca-4660-bf82-6a6f5fd4cc37 req-3a723438-2a24-4ff7-aeee-18402a4fcc40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Received event network-vif-plugged-665e4d53-decd-4665-b2a5-879bf031819d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:04:59 compute-1 nova_compute[187157]: 2025-12-03 00:04:59.235 187161 DEBUG oslo_concurrency.lockutils [req-d1ca3736-d0ca-4660-bf82-6a6f5fd4cc37 req-3a723438-2a24-4ff7-aeee-18402a4fcc40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:59 compute-1 nova_compute[187157]: 2025-12-03 00:04:59.235 187161 DEBUG oslo_concurrency.lockutils [req-d1ca3736-d0ca-4660-bf82-6a6f5fd4cc37 req-3a723438-2a24-4ff7-aeee-18402a4fcc40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:59 compute-1 nova_compute[187157]: 2025-12-03 00:04:59.236 187161 DEBUG oslo_concurrency.lockutils [req-d1ca3736-d0ca-4660-bf82-6a6f5fd4cc37 req-3a723438-2a24-4ff7-aeee-18402a4fcc40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:59 compute-1 nova_compute[187157]: 2025-12-03 00:04:59.236 187161 DEBUG nova.compute.manager [req-d1ca3736-d0ca-4660-bf82-6a6f5fd4cc37 req-3a723438-2a24-4ff7-aeee-18402a4fcc40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] No waiting events found dispatching network-vif-plugged-665e4d53-decd-4665-b2a5-879bf031819d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:04:59 compute-1 nova_compute[187157]: 2025-12-03 00:04:59.236 187161 WARNING nova.compute.manager [req-d1ca3736-d0ca-4660-bf82-6a6f5fd4cc37 req-3a723438-2a24-4ff7-aeee-18402a4fcc40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Received unexpected event network-vif-plugged-665e4d53-decd-4665-b2a5-879bf031819d for instance with vm_state active and task_state None.
Dec 03 00:04:59 compute-1 nova_compute[187157]: 2025-12-03 00:04:59.466 187161 DEBUG oslo_concurrency.lockutils [None req-7356511f-f204-4d99-9069-ac25dfae45cd ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.897s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:00 compute-1 podman[213969]: 2025-12-03 00:05:00.252394373 +0000 UTC m=+0.086717710 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Dec 03 00:05:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:01.722 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:01.722 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:01.723 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:02 compute-1 nova_compute[187157]: 2025-12-03 00:05:02.474 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:03 compute-1 nova_compute[187157]: 2025-12-03 00:05:03.607 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:05 compute-1 podman[197537]: time="2025-12-03T00:05:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:05:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:05:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18287 "" "Go-http-client/1.1"
Dec 03 00:05:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:05:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3075 "" "Go-http-client/1.1"
Dec 03 00:05:07 compute-1 nova_compute[187157]: 2025-12-03 00:05:07.475 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:08 compute-1 podman[213992]: 2025-12-03 00:05:08.200114124 +0000 UTC m=+0.047045154 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:05:08 compute-1 nova_compute[187157]: 2025-12-03 00:05:08.609 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:10 compute-1 ovn_controller[95464]: 2025-12-03T00:05:10Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e6:bc:36 10.100.0.10
Dec 03 00:05:10 compute-1 ovn_controller[95464]: 2025-12-03T00:05:10Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e6:bc:36 10.100.0.10
Dec 03 00:05:12 compute-1 nova_compute[187157]: 2025-12-03 00:05:12.477 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:13 compute-1 nova_compute[187157]: 2025-12-03 00:05:13.613 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:14 compute-1 podman[214033]: 2025-12-03 00:05:14.311288283 +0000 UTC m=+0.145374719 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 03 00:05:14 compute-1 nova_compute[187157]: 2025-12-03 00:05:14.711 187161 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Creating tmpfile /var/lib/nova/instances/tmp6w8eatjn to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 03 00:05:14 compute-1 nova_compute[187157]: 2025-12-03 00:05:14.712 187161 WARNING neutronclient.v2_0.client [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:14 compute-1 nova_compute[187157]: 2025-12-03 00:05:14.725 187161 DEBUG nova.compute.manager [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6w8eatjn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 03 00:05:16 compute-1 podman[214061]: 2025-12-03 00:05:16.203594444 +0000 UTC m=+0.044996684 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 03 00:05:16 compute-1 nova_compute[187157]: 2025-12-03 00:05:16.794 187161 WARNING neutronclient.v2_0.client [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:17 compute-1 nova_compute[187157]: 2025-12-03 00:05:17.480 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:18 compute-1 nova_compute[187157]: 2025-12-03 00:05:18.616 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:19 compute-1 openstack_network_exporter[199685]: ERROR   00:05:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:05:19 compute-1 openstack_network_exporter[199685]: ERROR   00:05:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:05:19 compute-1 openstack_network_exporter[199685]: ERROR   00:05:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:05:19 compute-1 openstack_network_exporter[199685]: ERROR   00:05:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:05:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:05:19 compute-1 openstack_network_exporter[199685]: ERROR   00:05:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:05:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:05:20 compute-1 nova_compute[187157]: 2025-12-03 00:05:20.842 187161 DEBUG nova.compute.manager [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6w8eatjn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5d86e858-6a62-411e-a8dc-dffcfa247bfc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 03 00:05:21 compute-1 nova_compute[187157]: 2025-12-03 00:05:21.858 187161 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-5d86e858-6a62-411e-a8dc-dffcfa247bfc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:05:21 compute-1 nova_compute[187157]: 2025-12-03 00:05:21.859 187161 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-5d86e858-6a62-411e-a8dc-dffcfa247bfc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:05:21 compute-1 nova_compute[187157]: 2025-12-03 00:05:21.859 187161 DEBUG nova.network.neutron [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:05:22 compute-1 nova_compute[187157]: 2025-12-03 00:05:22.370 187161 WARNING neutronclient.v2_0.client [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:22 compute-1 nova_compute[187157]: 2025-12-03 00:05:22.482 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:23 compute-1 nova_compute[187157]: 2025-12-03 00:05:23.464 187161 WARNING neutronclient.v2_0.client [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:23 compute-1 nova_compute[187157]: 2025-12-03 00:05:23.619 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:23 compute-1 nova_compute[187157]: 2025-12-03 00:05:23.657 187161 DEBUG nova.network.neutron [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Updating instance_info_cache with network_info: [{"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.163 187161 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-5d86e858-6a62-411e-a8dc-dffcfa247bfc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.178 187161 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6w8eatjn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5d86e858-6a62-411e-a8dc-dffcfa247bfc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.179 187161 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Creating instance directory: /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.179 187161 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Creating disk.info with the contents: {'/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk': 'qcow2', '/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.180 187161 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.180 187161 DEBUG nova.objects.instance [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5d86e858-6a62-411e-a8dc-dffcfa247bfc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.687 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.691 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.693 187161 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.744 187161 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.745 187161 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.746 187161 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.746 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.749 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.749 187161 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.800 187161 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.801 187161 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.834 187161 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.835 187161 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.836 187161 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.889 187161 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.890 187161 DEBUG nova.virt.disk.api [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.891 187161 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.950 187161 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.952 187161 DEBUG nova.virt.disk.api [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:05:24 compute-1 nova_compute[187157]: 2025-12-03 00:05:24.952 187161 DEBUG nova.objects.instance [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 5d86e858-6a62-411e-a8dc-dffcfa247bfc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.463 187161 DEBUG nova.objects.base [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<5d86e858-6a62-411e-a8dc-dffcfa247bfc> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.465 187161 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.496 187161 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk.config 497664" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.498 187161 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.501 187161 DEBUG nova.virt.libvirt.vif [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:04:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1547033723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1547033723',id=14,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:04:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-rbawllbh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:04:40Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=5d86e858-6a62-411e-a8dc-dffcfa247bfc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.502 187161 DEBUG nova.network.os_vif_util [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.504 187161 DEBUG nova.network.os_vif_util [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:ee:e6,bridge_name='br-int',has_traffic_filtering=True,id=3fc60c87-0094-403e-9fb0-564004da22b1,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fc60c87-00') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.505 187161 DEBUG os_vif [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:ee:e6,bridge_name='br-int',has_traffic_filtering=True,id=3fc60c87-0094-403e-9fb0-564004da22b1,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fc60c87-00') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.508 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.510 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.510 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.512 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.512 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'b367e7bd-c8dd-5599-9ab1-45aab2851543', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.514 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.515 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.518 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.519 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3fc60c87-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.519 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3fc60c87-00, col_values=(('qos', UUID('18e9b95a-621e-4ef0-ac8e-b03b0161d32a')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.520 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3fc60c87-00, col_values=(('external_ids', {'iface-id': '3fc60c87-0094-403e-9fb0-564004da22b1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:ee:e6', 'vm-uuid': '5d86e858-6a62-411e-a8dc-dffcfa247bfc'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.521 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:25 compute-1 NetworkManager[55553]: <info>  [1764720325.5219] manager: (tap3fc60c87-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.523 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.527 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.528 187161 INFO os_vif [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:ee:e6,bridge_name='br-int',has_traffic_filtering=True,id=3fc60c87-0094-403e-9fb0-564004da22b1,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fc60c87-00')
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.528 187161 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.528 187161 DEBUG nova.compute.manager [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6w8eatjn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5d86e858-6a62-411e-a8dc-dffcfa247bfc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.529 187161 WARNING neutronclient.v2_0.client [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:25 compute-1 nova_compute[187157]: 2025-12-03 00:05:25.673 187161 WARNING neutronclient.v2_0.client [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:26 compute-1 ovn_controller[95464]: 2025-12-03T00:05:26Z|00138|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 03 00:05:27 compute-1 nova_compute[187157]: 2025-12-03 00:05:27.519 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:27 compute-1 nova_compute[187157]: 2025-12-03 00:05:27.942 187161 DEBUG nova.network.neutron [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Port 3fc60c87-0094-403e-9fb0-564004da22b1 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 03 00:05:27 compute-1 nova_compute[187157]: 2025-12-03 00:05:27.954 187161 DEBUG nova.compute.manager [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6w8eatjn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5d86e858-6a62-411e-a8dc-dffcfa247bfc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 03 00:05:28 compute-1 podman[214100]: 2025-12-03 00:05:28.214301285 +0000 UTC m=+0.054447740 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, release=1755695350, architecture=x86_64, managed_by=edpm_ansible, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Dec 03 00:05:31 compute-1 nova_compute[187157]: 2025-12-03 00:05:31.047 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:31 compute-1 podman[214123]: 2025-12-03 00:05:31.214216388 +0000 UTC m=+0.059078201 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 03 00:05:32 compute-1 kernel: tap3fc60c87-00: entered promiscuous mode
Dec 03 00:05:32 compute-1 NetworkManager[55553]: <info>  [1764720332.1316] manager: (tap3fc60c87-00): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Dec 03 00:05:32 compute-1 ovn_controller[95464]: 2025-12-03T00:05:32Z|00139|binding|INFO|Claiming lport 3fc60c87-0094-403e-9fb0-564004da22b1 for this additional chassis.
Dec 03 00:05:32 compute-1 ovn_controller[95464]: 2025-12-03T00:05:32Z|00140|binding|INFO|3fc60c87-0094-403e-9fb0-564004da22b1: Claiming fa:16:3e:6d:ee:e6 10.100.0.11
Dec 03 00:05:32 compute-1 nova_compute[187157]: 2025-12-03 00:05:32.160 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:32 compute-1 systemd-udevd[214155]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:05:32 compute-1 NetworkManager[55553]: <info>  [1764720332.1780] device (tap3fc60c87-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:05:32 compute-1 NetworkManager[55553]: <info>  [1764720332.1800] device (tap3fc60c87-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:05:32 compute-1 ovn_controller[95464]: 2025-12-03T00:05:32Z|00141|binding|INFO|Setting lport 3fc60c87-0094-403e-9fb0-564004da22b1 ovn-installed in OVS
Dec 03 00:05:32 compute-1 nova_compute[187157]: 2025-12-03 00:05:32.189 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:32 compute-1 systemd-machined[153454]: New machine qemu-12-instance-0000000e.
Dec 03 00:05:32 compute-1 systemd[1]: Started Virtual Machine qemu-12-instance-0000000e.
Dec 03 00:05:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:32.243 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:ee:e6 10.100.0.11'], port_security=['fa:16:3e:6d:ee:e6 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5d86e858-6a62-411e-a8dc-dffcfa247bfc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85e2f91a92cf4b5a9d626e8418f17322', 'neutron:revision_number': '10', 'neutron:security_group_ids': '2256d612-5a1d-4528-93f3-139a5d1ff76a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e46e490-abb3-4025-b870-a46519cde774, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=3fc60c87-0094-403e-9fb0-564004da22b1) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:05:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:32.244 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 3fc60c87-0094-403e-9fb0-564004da22b1 in datapath ed11b71b-745b-4f0c-9f09-37d53d166bcb unbound from our chassis
Dec 03 00:05:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:32.246 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed11b71b-745b-4f0c-9f09-37d53d166bcb
Dec 03 00:05:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:32.267 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[fc4dfe4b-9817-4e53-8a9d-658beabccc9f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:32.305 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[d3f0c5e9-4d17-45a6-8a01-5f09d51c3e5f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:32.309 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[d0501b1d-483b-4515-886f-1c3536a3a0e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:32.343 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[057c7fa3-78a3-4984-904f-9be61a3e22d2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:32.363 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b9e002c7-206c-42c4-b6b8-2b6c195ec758]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped11b71b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:bb:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431661, 'reachable_time': 26201, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214172, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:32.383 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[7499c446-a2ac-4a23-9d79-90dbd2d1333f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'taped11b71b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431671, 'tstamp': 431671}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214173, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'taped11b71b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431674, 'tstamp': 431674}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214173, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:32.384 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped11b71b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:32 compute-1 nova_compute[187157]: 2025-12-03 00:05:32.385 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:32 compute-1 nova_compute[187157]: 2025-12-03 00:05:32.386 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:32.387 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped11b71b-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:32.387 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:05:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:32.387 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped11b71b-70, col_values=(('external_ids', {'iface-id': 'add6ea4f-8836-4bed-8f1e-39e943ccf4b5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:32.387 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:05:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:32.388 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[0dea2e10-a335-4924-8ad7-6621066a3a0a]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ed11b71b-745b-4f0c-9f09-37d53d166bcb\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ed11b71b-745b-4f0c-9f09-37d53d166bcb\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:32 compute-1 nova_compute[187157]: 2025-12-03 00:05:32.520 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:32 compute-1 nova_compute[187157]: 2025-12-03 00:05:32.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:05:34 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:34.622 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:05:34 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:34.623 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:05:34 compute-1 nova_compute[187157]: 2025-12-03 00:05:34.655 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:35 compute-1 ovn_controller[95464]: 2025-12-03T00:05:35Z|00142|binding|INFO|Claiming lport 3fc60c87-0094-403e-9fb0-564004da22b1 for this chassis.
Dec 03 00:05:35 compute-1 ovn_controller[95464]: 2025-12-03T00:05:35Z|00143|binding|INFO|3fc60c87-0094-403e-9fb0-564004da22b1: Claiming fa:16:3e:6d:ee:e6 10.100.0.11
Dec 03 00:05:35 compute-1 ovn_controller[95464]: 2025-12-03T00:05:35Z|00144|binding|INFO|Setting lport 3fc60c87-0094-403e-9fb0-564004da22b1 up in Southbound
Dec 03 00:05:35 compute-1 podman[197537]: time="2025-12-03T00:05:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:05:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:05:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18287 "" "Go-http-client/1.1"
Dec 03 00:05:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:05:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3071 "" "Go-http-client/1.1"
Dec 03 00:05:36 compute-1 nova_compute[187157]: 2025-12-03 00:05:36.051 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:36 compute-1 nova_compute[187157]: 2025-12-03 00:05:36.634 187161 INFO nova.compute.manager [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Post operation of migration started
Dec 03 00:05:36 compute-1 nova_compute[187157]: 2025-12-03 00:05:36.635 187161 WARNING neutronclient.v2_0.client [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:37 compute-1 nova_compute[187157]: 2025-12-03 00:05:37.090 187161 WARNING neutronclient.v2_0.client [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:37 compute-1 nova_compute[187157]: 2025-12-03 00:05:37.091 187161 WARNING neutronclient.v2_0.client [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:37 compute-1 nova_compute[187157]: 2025-12-03 00:05:37.162 187161 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-5d86e858-6a62-411e-a8dc-dffcfa247bfc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:05:37 compute-1 nova_compute[187157]: 2025-12-03 00:05:37.163 187161 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-5d86e858-6a62-411e-a8dc-dffcfa247bfc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:05:37 compute-1 nova_compute[187157]: 2025-12-03 00:05:37.163 187161 DEBUG nova.network.neutron [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:05:37 compute-1 nova_compute[187157]: 2025-12-03 00:05:37.522 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:37 compute-1 nova_compute[187157]: 2025-12-03 00:05:37.669 187161 WARNING neutronclient.v2_0.client [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:37 compute-1 nova_compute[187157]: 2025-12-03 00:05:37.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:05:38 compute-1 nova_compute[187157]: 2025-12-03 00:05:38.371 187161 WARNING neutronclient.v2_0.client [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:38 compute-1 nova_compute[187157]: 2025-12-03 00:05:38.554 187161 DEBUG nova.network.neutron [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Updating instance_info_cache with network_info: [{"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:05:39 compute-1 nova_compute[187157]: 2025-12-03 00:05:39.060 187161 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-5d86e858-6a62-411e-a8dc-dffcfa247bfc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:05:39 compute-1 podman[214195]: 2025-12-03 00:05:39.245163293 +0000 UTC m=+0.074503208 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:05:39 compute-1 nova_compute[187157]: 2025-12-03 00:05:39.575 187161 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:39 compute-1 nova_compute[187157]: 2025-12-03 00:05:39.576 187161 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:39 compute-1 nova_compute[187157]: 2025-12-03 00:05:39.576 187161 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:39 compute-1 nova_compute[187157]: 2025-12-03 00:05:39.580 187161 INFO nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 03 00:05:39 compute-1 virtqemud[186882]: Domain id=12 name='instance-0000000e' uuid=5d86e858-6a62-411e-a8dc-dffcfa247bfc is tainted: custom-monitor
Dec 03 00:05:40 compute-1 nova_compute[187157]: 2025-12-03 00:05:40.586 187161 INFO nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 03 00:05:40 compute-1 nova_compute[187157]: 2025-12-03 00:05:40.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:05:41 compute-1 nova_compute[187157]: 2025-12-03 00:05:41.054 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:41 compute-1 nova_compute[187157]: 2025-12-03 00:05:41.283 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:41 compute-1 nova_compute[187157]: 2025-12-03 00:05:41.284 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:41 compute-1 nova_compute[187157]: 2025-12-03 00:05:41.284 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:41 compute-1 nova_compute[187157]: 2025-12-03 00:05:41.284 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:05:41 compute-1 nova_compute[187157]: 2025-12-03 00:05:41.590 187161 INFO nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 03 00:05:41 compute-1 nova_compute[187157]: 2025-12-03 00:05:41.594 187161 DEBUG nova.compute.manager [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:05:42 compute-1 nova_compute[187157]: 2025-12-03 00:05:42.104 187161 DEBUG nova.objects.instance [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 03 00:05:42 compute-1 nova_compute[187157]: 2025-12-03 00:05:42.340 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:05:42 compute-1 nova_compute[187157]: 2025-12-03 00:05:42.401 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:05:42 compute-1 nova_compute[187157]: 2025-12-03 00:05:42.402 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:05:42 compute-1 nova_compute[187157]: 2025-12-03 00:05:42.455 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:05:42 compute-1 nova_compute[187157]: 2025-12-03 00:05:42.460 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:05:42 compute-1 nova_compute[187157]: 2025-12-03 00:05:42.506 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:05:42 compute-1 nova_compute[187157]: 2025-12-03 00:05:42.507 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:05:42 compute-1 nova_compute[187157]: 2025-12-03 00:05:42.523 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:42 compute-1 nova_compute[187157]: 2025-12-03 00:05:42.562 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:05:42 compute-1 nova_compute[187157]: 2025-12-03 00:05:42.699 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:05:42 compute-1 nova_compute[187157]: 2025-12-03 00:05:42.700 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:05:42 compute-1 nova_compute[187157]: 2025-12-03 00:05:42.720 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:05:42 compute-1 nova_compute[187157]: 2025-12-03 00:05:42.721 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5534MB free_disk=73.10858535766602GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:05:42 compute-1 nova_compute[187157]: 2025-12-03 00:05:42.721 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:42 compute-1 nova_compute[187157]: 2025-12-03 00:05:42.721 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:43 compute-1 nova_compute[187157]: 2025-12-03 00:05:43.129 187161 WARNING neutronclient.v2_0.client [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:43 compute-1 nova_compute[187157]: 2025-12-03 00:05:43.492 187161 WARNING neutronclient.v2_0.client [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:43 compute-1 nova_compute[187157]: 2025-12-03 00:05:43.492 187161 WARNING neutronclient.v2_0.client [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:43 compute-1 nova_compute[187157]: 2025-12-03 00:05:43.743 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Applying migration context for instance 5d86e858-6a62-411e-a8dc-dffcfa247bfc as it has an incoming, in-progress migration ad96c46c-250d-4dee-aab8-996ce344a8d0. Migration status is running _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Dec 03 00:05:43 compute-1 nova_compute[187157]: 2025-12-03 00:05:43.743 187161 DEBUG nova.objects.instance [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 03 00:05:44 compute-1 nova_compute[187157]: 2025-12-03 00:05:44.250 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:05:44 compute-1 nova_compute[187157]: 2025-12-03 00:05:44.299 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 6d03c9dd-e243-47ab-abb5-8a2a5387297d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:05:44 compute-1 nova_compute[187157]: 2025-12-03 00:05:44.299 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 5d86e858-6a62-411e-a8dc-dffcfa247bfc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:05:44 compute-1 nova_compute[187157]: 2025-12-03 00:05:44.299 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:05:44 compute-1 nova_compute[187157]: 2025-12-03 00:05:44.300 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:05:42 up  1:12,  0 user,  load average: 0.55, 0.39, 0.38\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_85e2f91a92cf4b5a9d626e8418f17322': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:05:44 compute-1 nova_compute[187157]: 2025-12-03 00:05:44.370 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:05:44 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:44.626 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:44 compute-1 nova_compute[187157]: 2025-12-03 00:05:44.879 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:05:45 compute-1 podman[214233]: 2025-12-03 00:05:45.251880612 +0000 UTC m=+0.092836657 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 03 00:05:45 compute-1 nova_compute[187157]: 2025-12-03 00:05:45.388 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:05:45 compute-1 nova_compute[187157]: 2025-12-03 00:05:45.389 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.668s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:46 compute-1 nova_compute[187157]: 2025-12-03 00:05:46.056 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:46 compute-1 nova_compute[187157]: 2025-12-03 00:05:46.385 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:05:46 compute-1 nova_compute[187157]: 2025-12-03 00:05:46.386 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:05:46 compute-1 nova_compute[187157]: 2025-12-03 00:05:46.386 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:05:46 compute-1 nova_compute[187157]: 2025-12-03 00:05:46.386 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:05:46 compute-1 nova_compute[187157]: 2025-12-03 00:05:46.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:05:47 compute-1 podman[214259]: 2025-12-03 00:05:47.207947843 +0000 UTC m=+0.044313718 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 03 00:05:47 compute-1 nova_compute[187157]: 2025-12-03 00:05:47.524 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:47 compute-1 nova_compute[187157]: 2025-12-03 00:05:47.595 187161 DEBUG oslo_concurrency.lockutils [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:47 compute-1 nova_compute[187157]: 2025-12-03 00:05:47.595 187161 DEBUG oslo_concurrency.lockutils [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:47 compute-1 nova_compute[187157]: 2025-12-03 00:05:47.595 187161 DEBUG oslo_concurrency.lockutils [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:47 compute-1 nova_compute[187157]: 2025-12-03 00:05:47.596 187161 DEBUG oslo_concurrency.lockutils [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:47 compute-1 nova_compute[187157]: 2025-12-03 00:05:47.596 187161 DEBUG oslo_concurrency.lockutils [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:47 compute-1 nova_compute[187157]: 2025-12-03 00:05:47.605 187161 INFO nova.compute.manager [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Terminating instance
Dec 03 00:05:47 compute-1 nova_compute[187157]: 2025-12-03 00:05:47.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.123 187161 DEBUG nova.compute.manager [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:05:48 compute-1 kernel: tap665e4d53-de (unregistering): left promiscuous mode
Dec 03 00:05:48 compute-1 NetworkManager[55553]: <info>  [1764720348.1492] device (tap665e4d53-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:05:48 compute-1 ovn_controller[95464]: 2025-12-03T00:05:48Z|00145|binding|INFO|Releasing lport 665e4d53-decd-4665-b2a5-879bf031819d from this chassis (sb_readonly=0)
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.156 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:48 compute-1 ovn_controller[95464]: 2025-12-03T00:05:48Z|00146|binding|INFO|Setting lport 665e4d53-decd-4665-b2a5-879bf031819d down in Southbound
Dec 03 00:05:48 compute-1 ovn_controller[95464]: 2025-12-03T00:05:48Z|00147|binding|INFO|Removing iface tap665e4d53-de ovn-installed in OVS
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.158 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:48.164 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:bc:36 10.100.0.10'], port_security=['fa:16:3e:e6:bc:36 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '6d03c9dd-e243-47ab-abb5-8a2a5387297d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85e2f91a92cf4b5a9d626e8418f17322', 'neutron:revision_number': '5', 'neutron:security_group_ids': '2256d612-5a1d-4528-93f3-139a5d1ff76a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e46e490-abb3-4025-b870-a46519cde774, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=665e4d53-decd-4665-b2a5-879bf031819d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:05:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:48.165 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 665e4d53-decd-4665-b2a5-879bf031819d in datapath ed11b71b-745b-4f0c-9f09-37d53d166bcb unbound from our chassis
Dec 03 00:05:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:48.165 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed11b71b-745b-4f0c-9f09-37d53d166bcb
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.169 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:48.178 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[1a75ce5a-9c98-4c84-ba1e-e42cb22a6eb6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:48.200 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[809e806c-4aea-4c3a-954d-bd043aa6842e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:48.202 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[a1f9dfcc-1b1b-4117-8e08-ee74c508cc11]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:48 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Dec 03 00:05:48 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Consumed 14.439s CPU time.
Dec 03 00:05:48 compute-1 systemd-machined[153454]: Machine qemu-11-instance-0000000f terminated.
Dec 03 00:05:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:48.225 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[633ccc73-fb2b-4450-901d-264035008779]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:48.238 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[8d8f54d3-19e5-454c-aa54-4fc3fbd40a64]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped11b71b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:bb:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431661, 'reachable_time': 26201, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214289, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:48.253 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d071ba60-55a5-49fa-bab6-50e4eda76db6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'taped11b71b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431671, 'tstamp': 431671}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214290, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'taped11b71b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431674, 'tstamp': 431674}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214290, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:48.253 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped11b71b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.255 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.259 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:48.260 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped11b71b-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:48.260 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:05:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:48.260 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped11b71b-70, col_values=(('external_ids', {'iface-id': 'add6ea4f-8836-4bed-8f1e-39e943ccf4b5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:48.261 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:05:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:48.262 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[41a44161-d327-4dd5-a40e-8517795e8b78]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ed11b71b-745b-4f0c-9f09-37d53d166bcb\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ed11b71b-745b-4f0c-9f09-37d53d166bcb\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.376 187161 INFO nova.virt.libvirt.driver [-] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Instance destroyed successfully.
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.377 187161 DEBUG nova.objects.instance [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lazy-loading 'resources' on Instance uuid 6d03c9dd-e243-47ab-abb5-8a2a5387297d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.559 187161 DEBUG nova.compute.manager [req-472d9b56-3cf6-4848-b3a6-5985a37171e1 req-3fded16b-be40-48a2-9ee5-d0897d0e4803 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Received event network-vif-unplugged-665e4d53-decd-4665-b2a5-879bf031819d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.559 187161 DEBUG oslo_concurrency.lockutils [req-472d9b56-3cf6-4848-b3a6-5985a37171e1 req-3fded16b-be40-48a2-9ee5-d0897d0e4803 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.559 187161 DEBUG oslo_concurrency.lockutils [req-472d9b56-3cf6-4848-b3a6-5985a37171e1 req-3fded16b-be40-48a2-9ee5-d0897d0e4803 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.559 187161 DEBUG oslo_concurrency.lockutils [req-472d9b56-3cf6-4848-b3a6-5985a37171e1 req-3fded16b-be40-48a2-9ee5-d0897d0e4803 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.559 187161 DEBUG nova.compute.manager [req-472d9b56-3cf6-4848-b3a6-5985a37171e1 req-3fded16b-be40-48a2-9ee5-d0897d0e4803 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] No waiting events found dispatching network-vif-unplugged-665e4d53-decd-4665-b2a5-879bf031819d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.559 187161 DEBUG nova.compute.manager [req-472d9b56-3cf6-4848-b3a6-5985a37171e1 req-3fded16b-be40-48a2-9ee5-d0897d0e4803 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Received event network-vif-unplugged-665e4d53-decd-4665-b2a5-879bf031819d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.887 187161 DEBUG nova.virt.libvirt.vif [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-03T00:04:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-2014606199',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-2014606199',id=15,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:04:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-3rl8nbar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',i
mage_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:04:58Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=6d03c9dd-e243-47ab-abb5-8a2a5387297d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "665e4d53-decd-4665-b2a5-879bf031819d", "address": "fa:16:3e:e6:bc:36", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap665e4d53-de", "ovs_interfaceid": "665e4d53-decd-4665-b2a5-879bf031819d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.888 187161 DEBUG nova.network.os_vif_util [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converting VIF {"id": "665e4d53-decd-4665-b2a5-879bf031819d", "address": "fa:16:3e:e6:bc:36", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap665e4d53-de", "ovs_interfaceid": "665e4d53-decd-4665-b2a5-879bf031819d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.889 187161 DEBUG nova.network.os_vif_util [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:bc:36,bridge_name='br-int',has_traffic_filtering=True,id=665e4d53-decd-4665-b2a5-879bf031819d,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap665e4d53-de') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.889 187161 DEBUG os_vif [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:bc:36,bridge_name='br-int',has_traffic_filtering=True,id=665e4d53-decd-4665-b2a5-879bf031819d,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap665e4d53-de') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.891 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.891 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap665e4d53-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.892 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.895 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.896 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.896 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=cdbb433a-e623-4d4f-b312-2fad77af1962) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.897 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.898 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.900 187161 INFO os_vif [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:bc:36,bridge_name='br-int',has_traffic_filtering=True,id=665e4d53-decd-4665-b2a5-879bf031819d,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap665e4d53-de')
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.900 187161 INFO nova.virt.libvirt.driver [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Deleting instance files /var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d_del
Dec 03 00:05:48 compute-1 nova_compute[187157]: 2025-12-03 00:05:48.901 187161 INFO nova.virt.libvirt.driver [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Deletion of /var/lib/nova/instances/6d03c9dd-e243-47ab-abb5-8a2a5387297d_del complete
Dec 03 00:05:49 compute-1 nova_compute[187157]: 2025-12-03 00:05:49.413 187161 INFO nova.compute.manager [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Took 1.29 seconds to destroy the instance on the hypervisor.
Dec 03 00:05:49 compute-1 nova_compute[187157]: 2025-12-03 00:05:49.414 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:05:49 compute-1 nova_compute[187157]: 2025-12-03 00:05:49.414 187161 DEBUG nova.compute.manager [-] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:05:49 compute-1 nova_compute[187157]: 2025-12-03 00:05:49.414 187161 DEBUG nova.network.neutron [-] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:05:49 compute-1 nova_compute[187157]: 2025-12-03 00:05:49.415 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:49 compute-1 openstack_network_exporter[199685]: ERROR   00:05:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:05:49 compute-1 openstack_network_exporter[199685]: ERROR   00:05:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:05:49 compute-1 openstack_network_exporter[199685]: ERROR   00:05:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:05:49 compute-1 openstack_network_exporter[199685]: ERROR   00:05:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:05:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:05:49 compute-1 openstack_network_exporter[199685]: ERROR   00:05:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:05:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:05:50 compute-1 nova_compute[187157]: 2025-12-03 00:05:50.520 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:50 compute-1 nova_compute[187157]: 2025-12-03 00:05:50.678 187161 DEBUG nova.compute.manager [req-51aa0190-2cfd-47ef-a4a4-22503fd52942 req-7d8bc66d-e45f-4745-95db-a1189866f3cd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Received event network-vif-unplugged-665e4d53-decd-4665-b2a5-879bf031819d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:05:50 compute-1 nova_compute[187157]: 2025-12-03 00:05:50.678 187161 DEBUG oslo_concurrency.lockutils [req-51aa0190-2cfd-47ef-a4a4-22503fd52942 req-7d8bc66d-e45f-4745-95db-a1189866f3cd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:50 compute-1 nova_compute[187157]: 2025-12-03 00:05:50.679 187161 DEBUG oslo_concurrency.lockutils [req-51aa0190-2cfd-47ef-a4a4-22503fd52942 req-7d8bc66d-e45f-4745-95db-a1189866f3cd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:50 compute-1 nova_compute[187157]: 2025-12-03 00:05:50.679 187161 DEBUG oslo_concurrency.lockutils [req-51aa0190-2cfd-47ef-a4a4-22503fd52942 req-7d8bc66d-e45f-4745-95db-a1189866f3cd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:50 compute-1 nova_compute[187157]: 2025-12-03 00:05:50.679 187161 DEBUG nova.compute.manager [req-51aa0190-2cfd-47ef-a4a4-22503fd52942 req-7d8bc66d-e45f-4745-95db-a1189866f3cd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] No waiting events found dispatching network-vif-unplugged-665e4d53-decd-4665-b2a5-879bf031819d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:05:50 compute-1 nova_compute[187157]: 2025-12-03 00:05:50.679 187161 DEBUG nova.compute.manager [req-51aa0190-2cfd-47ef-a4a4-22503fd52942 req-7d8bc66d-e45f-4745-95db-a1189866f3cd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Received event network-vif-unplugged-665e4d53-decd-4665-b2a5-879bf031819d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:05:51 compute-1 nova_compute[187157]: 2025-12-03 00:05:51.478 187161 DEBUG nova.network.neutron [-] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:05:51 compute-1 nova_compute[187157]: 2025-12-03 00:05:51.989 187161 INFO nova.compute.manager [-] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Took 2.57 seconds to deallocate network for instance.
Dec 03 00:05:52 compute-1 nova_compute[187157]: 2025-12-03 00:05:52.509 187161 DEBUG oslo_concurrency.lockutils [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:52 compute-1 nova_compute[187157]: 2025-12-03 00:05:52.510 187161 DEBUG oslo_concurrency.lockutils [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:52 compute-1 nova_compute[187157]: 2025-12-03 00:05:52.526 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:52 compute-1 nova_compute[187157]: 2025-12-03 00:05:52.562 187161 DEBUG nova.compute.provider_tree [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:05:52 compute-1 nova_compute[187157]: 2025-12-03 00:05:52.748 187161 DEBUG nova.compute.manager [req-ccd11471-6687-4549-8fd6-8a5c3f943f8b req-7936e94d-293f-42d1-9af4-2e261c8c3348 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 6d03c9dd-e243-47ab-abb5-8a2a5387297d] Received event network-vif-deleted-665e4d53-decd-4665-b2a5-879bf031819d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:05:53 compute-1 nova_compute[187157]: 2025-12-03 00:05:53.068 187161 DEBUG nova.scheduler.client.report [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:05:53 compute-1 nova_compute[187157]: 2025-12-03 00:05:53.580 187161 DEBUG oslo_concurrency.lockutils [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.070s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:53 compute-1 nova_compute[187157]: 2025-12-03 00:05:53.601 187161 INFO nova.scheduler.client.report [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Deleted allocations for instance 6d03c9dd-e243-47ab-abb5-8a2a5387297d
Dec 03 00:05:53 compute-1 nova_compute[187157]: 2025-12-03 00:05:53.897 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:54 compute-1 nova_compute[187157]: 2025-12-03 00:05:54.630 187161 DEBUG oslo_concurrency.lockutils [None req-f40ec1b0-9e7c-4566-9be8-fe43baa5e7e5 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "6d03c9dd-e243-47ab-abb5-8a2a5387297d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.035s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:55 compute-1 nova_compute[187157]: 2025-12-03 00:05:55.337 187161 DEBUG oslo_concurrency.lockutils [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:55 compute-1 nova_compute[187157]: 2025-12-03 00:05:55.337 187161 DEBUG oslo_concurrency.lockutils [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:55 compute-1 nova_compute[187157]: 2025-12-03 00:05:55.337 187161 DEBUG oslo_concurrency.lockutils [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:55 compute-1 nova_compute[187157]: 2025-12-03 00:05:55.338 187161 DEBUG oslo_concurrency.lockutils [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:55 compute-1 nova_compute[187157]: 2025-12-03 00:05:55.338 187161 DEBUG oslo_concurrency.lockutils [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:55 compute-1 nova_compute[187157]: 2025-12-03 00:05:55.351 187161 INFO nova.compute.manager [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Terminating instance
Dec 03 00:05:55 compute-1 nova_compute[187157]: 2025-12-03 00:05:55.866 187161 DEBUG nova.compute.manager [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:05:55 compute-1 kernel: tap3fc60c87-00 (unregistering): left promiscuous mode
Dec 03 00:05:55 compute-1 NetworkManager[55553]: <info>  [1764720355.9032] device (tap3fc60c87-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:05:55 compute-1 nova_compute[187157]: 2025-12-03 00:05:55.913 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:55 compute-1 ovn_controller[95464]: 2025-12-03T00:05:55Z|00148|binding|INFO|Releasing lport 3fc60c87-0094-403e-9fb0-564004da22b1 from this chassis (sb_readonly=0)
Dec 03 00:05:55 compute-1 ovn_controller[95464]: 2025-12-03T00:05:55Z|00149|binding|INFO|Setting lport 3fc60c87-0094-403e-9fb0-564004da22b1 down in Southbound
Dec 03 00:05:55 compute-1 ovn_controller[95464]: 2025-12-03T00:05:55Z|00150|binding|INFO|Removing iface tap3fc60c87-00 ovn-installed in OVS
Dec 03 00:05:55 compute-1 nova_compute[187157]: 2025-12-03 00:05:55.915 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:55.927 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:ee:e6 10.100.0.11'], port_security=['fa:16:3e:6d:ee:e6 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5d86e858-6a62-411e-a8dc-dffcfa247bfc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85e2f91a92cf4b5a9d626e8418f17322', 'neutron:revision_number': '15', 'neutron:security_group_ids': '2256d612-5a1d-4528-93f3-139a5d1ff76a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e46e490-abb3-4025-b870-a46519cde774, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=3fc60c87-0094-403e-9fb0-564004da22b1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:05:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:55.928 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 3fc60c87-0094-403e-9fb0-564004da22b1 in datapath ed11b71b-745b-4f0c-9f09-37d53d166bcb unbound from our chassis
Dec 03 00:05:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:55.929 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed11b71b-745b-4f0c-9f09-37d53d166bcb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:05:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:55.930 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[8250e0f5-f932-480b-9339-0dce41ad95f0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:55.930 104348 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb namespace which is not needed anymore
Dec 03 00:05:55 compute-1 nova_compute[187157]: 2025-12-03 00:05:55.945 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:55 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Dec 03 00:05:55 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000e.scope: Consumed 1.931s CPU time.
Dec 03 00:05:55 compute-1 systemd-machined[153454]: Machine qemu-12-instance-0000000e terminated.
Dec 03 00:05:56 compute-1 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[213945]: [NOTICE]   (213957) : haproxy version is 3.0.5-8e879a5
Dec 03 00:05:56 compute-1 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[213945]: [NOTICE]   (213957) : path to executable is /usr/sbin/haproxy
Dec 03 00:05:56 compute-1 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[213945]: [WARNING]  (213957) : Exiting Master process...
Dec 03 00:05:56 compute-1 podman[214339]: 2025-12-03 00:05:56.064418881 +0000 UTC m=+0.035375384 container kill 28ef8ee99cee1243bcba3f46243723338232299b4b832adb7f68499eee6e683f (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 03 00:05:56 compute-1 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[213945]: [ALERT]    (213957) : Current worker (213960) exited with code 143 (Terminated)
Dec 03 00:05:56 compute-1 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[213945]: [WARNING]  (213957) : All workers exited. Exiting... (0)
Dec 03 00:05:56 compute-1 systemd[1]: libpod-28ef8ee99cee1243bcba3f46243723338232299b4b832adb7f68499eee6e683f.scope: Deactivated successfully.
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.087 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.097 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.101 187161 DEBUG nova.compute.manager [req-5c6a4802-4e61-40f1-bdc8-6d7493568a40 req-3968f3bf-5500-40a0-9e1b-170a76cbe228 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.101 187161 DEBUG oslo_concurrency.lockutils [req-5c6a4802-4e61-40f1-bdc8-6d7493568a40 req-3968f3bf-5500-40a0-9e1b-170a76cbe228 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.102 187161 DEBUG oslo_concurrency.lockutils [req-5c6a4802-4e61-40f1-bdc8-6d7493568a40 req-3968f3bf-5500-40a0-9e1b-170a76cbe228 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.103 187161 DEBUG oslo_concurrency.lockutils [req-5c6a4802-4e61-40f1-bdc8-6d7493568a40 req-3968f3bf-5500-40a0-9e1b-170a76cbe228 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.103 187161 DEBUG nova.compute.manager [req-5c6a4802-4e61-40f1-bdc8-6d7493568a40 req-3968f3bf-5500-40a0-9e1b-170a76cbe228 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] No waiting events found dispatching network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.103 187161 DEBUG nova.compute.manager [req-5c6a4802-4e61-40f1-bdc8-6d7493568a40 req-3968f3bf-5500-40a0-9e1b-170a76cbe228 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:05:56 compute-1 podman[214358]: 2025-12-03 00:05:56.126950293 +0000 UTC m=+0.034025473 container died 28ef8ee99cee1243bcba3f46243723338232299b4b832adb7f68499eee6e683f (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.138 187161 INFO nova.virt.libvirt.driver [-] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Instance destroyed successfully.
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.139 187161 DEBUG nova.objects.instance [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lazy-loading 'resources' on Instance uuid 5d86e858-6a62-411e-a8dc-dffcfa247bfc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:05:56 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-28ef8ee99cee1243bcba3f46243723338232299b4b832adb7f68499eee6e683f-userdata-shm.mount: Deactivated successfully.
Dec 03 00:05:56 compute-1 systemd[1]: var-lib-containers-storage-overlay-d25f39ef4c66ce9ed57fe02963525cdc3213770c3a55a0ad4d9b72831cfb6f16-merged.mount: Deactivated successfully.
Dec 03 00:05:56 compute-1 podman[214358]: 2025-12-03 00:05:56.171012574 +0000 UTC m=+0.078087744 container remove 28ef8ee99cee1243bcba3f46243723338232299b4b832adb7f68499eee6e683f (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Dec 03 00:05:56 compute-1 systemd[1]: libpod-conmon-28ef8ee99cee1243bcba3f46243723338232299b4b832adb7f68499eee6e683f.scope: Deactivated successfully.
Dec 03 00:05:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:56.179 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ce47e6a4-1e85-42e5-b05e-3c20aa11ba49]: (4, ("Wed Dec  3 12:05:56 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb (28ef8ee99cee1243bcba3f46243723338232299b4b832adb7f68499eee6e683f)\n28ef8ee99cee1243bcba3f46243723338232299b4b832adb7f68499eee6e683f\nWed Dec  3 12:05:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb (28ef8ee99cee1243bcba3f46243723338232299b4b832adb7f68499eee6e683f)\n28ef8ee99cee1243bcba3f46243723338232299b4b832adb7f68499eee6e683f\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:56.181 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ff685503-c2eb-4b0c-a5f8-4920e0364d37]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:56.181 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:05:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:56.182 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[eaef1f20-d801-4163-a2da-37df41768b6e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:56.183 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped11b71b-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.186 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:56 compute-1 kernel: taped11b71b-70: left promiscuous mode
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.214 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:56.216 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[f0e1235a-ffd7-4caa-80f1-e31d2fbb1d46]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:56.232 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[c7de2ae8-328d-465f-8d30-0a0637a222dd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:56.234 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[61df08d0-efa4-4e52-ac28-6d1f02f52135]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:56.251 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[af7274f1-4d9f-453c-952b-b7d87caf64e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431655, 'reachable_time': 35461, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214402, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:56.253 104464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:05:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:05:56.253 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[73eda849-9e2b-4412-8063-ba7d530ba26e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:56 compute-1 systemd[1]: run-netns-ovnmeta\x2ded11b71b\x2d745b\x2d4f0c\x2d9f09\x2d37d53d166bcb.mount: Deactivated successfully.
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.647 187161 DEBUG nova.virt.libvirt.vif [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-03T00:04:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1547033723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1547033723',id=14,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:04:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-rbawllbh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:05:42Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=5d86e858-6a62-411e-a8dc-dffcfa247bfc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.648 187161 DEBUG nova.network.os_vif_util [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converting VIF {"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.648 187161 DEBUG nova.network.os_vif_util [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6d:ee:e6,bridge_name='br-int',has_traffic_filtering=True,id=3fc60c87-0094-403e-9fb0-564004da22b1,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fc60c87-00') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.649 187161 DEBUG os_vif [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6d:ee:e6,bridge_name='br-int',has_traffic_filtering=True,id=3fc60c87-0094-403e-9fb0-564004da22b1,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fc60c87-00') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.650 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.650 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3fc60c87-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.651 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.652 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.653 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.653 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=18e9b95a-621e-4ef0-ac8e-b03b0161d32a) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.654 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.655 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.657 187161 INFO os_vif [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6d:ee:e6,bridge_name='br-int',has_traffic_filtering=True,id=3fc60c87-0094-403e-9fb0-564004da22b1,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fc60c87-00')
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.658 187161 INFO nova.virt.libvirt.driver [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Deleting instance files /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc_del
Dec 03 00:05:56 compute-1 nova_compute[187157]: 2025-12-03 00:05:56.658 187161 INFO nova.virt.libvirt.driver [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Deletion of /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc_del complete
Dec 03 00:05:57 compute-1 nova_compute[187157]: 2025-12-03 00:05:57.169 187161 INFO nova.compute.manager [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Took 1.30 seconds to destroy the instance on the hypervisor.
Dec 03 00:05:57 compute-1 nova_compute[187157]: 2025-12-03 00:05:57.170 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:05:57 compute-1 nova_compute[187157]: 2025-12-03 00:05:57.171 187161 DEBUG nova.compute.manager [-] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:05:57 compute-1 nova_compute[187157]: 2025-12-03 00:05:57.171 187161 DEBUG nova.network.neutron [-] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:05:57 compute-1 nova_compute[187157]: 2025-12-03 00:05:57.171 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:57 compute-1 nova_compute[187157]: 2025-12-03 00:05:57.485 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:57 compute-1 nova_compute[187157]: 2025-12-03 00:05:57.528 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:58 compute-1 nova_compute[187157]: 2025-12-03 00:05:58.155 187161 DEBUG nova.compute.manager [req-6eb284f7-d381-4b5a-95e8-f93c9948b207 req-10b7e14f-e543-4df0-9159-25cb6bd83b19 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:05:58 compute-1 nova_compute[187157]: 2025-12-03 00:05:58.156 187161 DEBUG oslo_concurrency.lockutils [req-6eb284f7-d381-4b5a-95e8-f93c9948b207 req-10b7e14f-e543-4df0-9159-25cb6bd83b19 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:58 compute-1 nova_compute[187157]: 2025-12-03 00:05:58.157 187161 DEBUG oslo_concurrency.lockutils [req-6eb284f7-d381-4b5a-95e8-f93c9948b207 req-10b7e14f-e543-4df0-9159-25cb6bd83b19 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:58 compute-1 nova_compute[187157]: 2025-12-03 00:05:58.157 187161 DEBUG oslo_concurrency.lockutils [req-6eb284f7-d381-4b5a-95e8-f93c9948b207 req-10b7e14f-e543-4df0-9159-25cb6bd83b19 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:58 compute-1 nova_compute[187157]: 2025-12-03 00:05:58.158 187161 DEBUG nova.compute.manager [req-6eb284f7-d381-4b5a-95e8-f93c9948b207 req-10b7e14f-e543-4df0-9159-25cb6bd83b19 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] No waiting events found dispatching network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:05:58 compute-1 nova_compute[187157]: 2025-12-03 00:05:58.158 187161 DEBUG nova.compute.manager [req-6eb284f7-d381-4b5a-95e8-f93c9948b207 req-10b7e14f-e543-4df0-9159-25cb6bd83b19 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:05:58 compute-1 nova_compute[187157]: 2025-12-03 00:05:58.158 187161 DEBUG nova.compute.manager [req-6eb284f7-d381-4b5a-95e8-f93c9948b207 req-10b7e14f-e543-4df0-9159-25cb6bd83b19 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-deleted-3fc60c87-0094-403e-9fb0-564004da22b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:05:58 compute-1 nova_compute[187157]: 2025-12-03 00:05:58.159 187161 INFO nova.compute.manager [req-6eb284f7-d381-4b5a-95e8-f93c9948b207 req-10b7e14f-e543-4df0-9159-25cb6bd83b19 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Neutron deleted interface 3fc60c87-0094-403e-9fb0-564004da22b1; detaching it from the instance and deleting it from the info cache
Dec 03 00:05:58 compute-1 nova_compute[187157]: 2025-12-03 00:05:58.159 187161 DEBUG nova.network.neutron [req-6eb284f7-d381-4b5a-95e8-f93c9948b207 req-10b7e14f-e543-4df0-9159-25cb6bd83b19 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:05:58 compute-1 nova_compute[187157]: 2025-12-03 00:05:58.253 187161 DEBUG nova.network.neutron [-] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:05:58 compute-1 nova_compute[187157]: 2025-12-03 00:05:58.670 187161 DEBUG nova.compute.manager [req-6eb284f7-d381-4b5a-95e8-f93c9948b207 req-10b7e14f-e543-4df0-9159-25cb6bd83b19 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Detach interface failed, port_id=3fc60c87-0094-403e-9fb0-564004da22b1, reason: Instance 5d86e858-6a62-411e-a8dc-dffcfa247bfc could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:05:58 compute-1 nova_compute[187157]: 2025-12-03 00:05:58.762 187161 INFO nova.compute.manager [-] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Took 1.59 seconds to deallocate network for instance.
Dec 03 00:05:59 compute-1 podman[214403]: 2025-12-03 00:05:59.266878295 +0000 UTC m=+0.083555824 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, release=1755695350, vcs-type=git, config_id=edpm, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, version=9.6, architecture=x86_64, managed_by=edpm_ansible)
Dec 03 00:05:59 compute-1 nova_compute[187157]: 2025-12-03 00:05:59.306 187161 DEBUG oslo_concurrency.lockutils [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:59 compute-1 nova_compute[187157]: 2025-12-03 00:05:59.306 187161 DEBUG oslo_concurrency.lockutils [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:59 compute-1 nova_compute[187157]: 2025-12-03 00:05:59.370 187161 DEBUG nova.compute.provider_tree [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:05:59 compute-1 nova_compute[187157]: 2025-12-03 00:05:59.880 187161 DEBUG nova.scheduler.client.report [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:06:00 compute-1 nova_compute[187157]: 2025-12-03 00:06:00.394 187161 DEBUG oslo_concurrency.lockutils [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.088s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:06:00 compute-1 nova_compute[187157]: 2025-12-03 00:06:00.420 187161 INFO nova.scheduler.client.report [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Deleted allocations for instance 5d86e858-6a62-411e-a8dc-dffcfa247bfc
Dec 03 00:06:01 compute-1 nova_compute[187157]: 2025-12-03 00:06:01.452 187161 DEBUG oslo_concurrency.lockutils [None req-2cf75fc5-06e4-4ec0-8f3f-8061e9ff9cfa ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:06:01 compute-1 nova_compute[187157]: 2025-12-03 00:06:01.655 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:06:01.723 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:06:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:06:01.724 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:06:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:06:01.724 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:06:02 compute-1 podman[214428]: 2025-12-03 00:06:02.223521835 +0000 UTC m=+0.055232229 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:06:02 compute-1 nova_compute[187157]: 2025-12-03 00:06:02.564 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:05 compute-1 podman[197537]: time="2025-12-03T00:06:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:06:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:06:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:06:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:06:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2610 "" "Go-http-client/1.1"
Dec 03 00:06:06 compute-1 nova_compute[187157]: 2025-12-03 00:06:06.656 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:07 compute-1 nova_compute[187157]: 2025-12-03 00:06:07.120 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:07 compute-1 nova_compute[187157]: 2025-12-03 00:06:07.566 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:10 compute-1 podman[214449]: 2025-12-03 00:06:10.208207027 +0000 UTC m=+0.050047504 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:06:11 compute-1 nova_compute[187157]: 2025-12-03 00:06:11.658 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:12 compute-1 nova_compute[187157]: 2025-12-03 00:06:12.567 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:16 compute-1 podman[214472]: 2025-12-03 00:06:16.227445234 +0000 UTC m=+0.074961169 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:06:16 compute-1 nova_compute[187157]: 2025-12-03 00:06:16.659 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:17 compute-1 nova_compute[187157]: 2025-12-03 00:06:17.568 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:18 compute-1 podman[214498]: 2025-12-03 00:06:18.209609978 +0000 UTC m=+0.050409223 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 03 00:06:19 compute-1 openstack_network_exporter[199685]: ERROR   00:06:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:06:19 compute-1 openstack_network_exporter[199685]: ERROR   00:06:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:06:19 compute-1 openstack_network_exporter[199685]: ERROR   00:06:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:06:19 compute-1 openstack_network_exporter[199685]: ERROR   00:06:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:06:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:06:19 compute-1 openstack_network_exporter[199685]: ERROR   00:06:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:06:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:06:19 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:06:19.575 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:f8:06 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b356b9112e0c4e6083f56fc1c7796972', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fb2dc55-b9aa-4540-a79d-797e2b8e81ae, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=df9da247-f3c2-412c-95a4-9a2562c93dd4) old=Port_Binding(mac=['fa:16:3e:04:f8:06'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b356b9112e0c4e6083f56fc1c7796972', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:06:19 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:06:19.576 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port df9da247-f3c2-412c-95a4-9a2562c93dd4 in datapath 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 updated
Dec 03 00:06:19 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:06:19.576 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:06:19 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:06:19.577 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[bc59c708-614b-4b8d-8767-78112956a490]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:21 compute-1 nova_compute[187157]: 2025-12-03 00:06:21.713 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:22 compute-1 nova_compute[187157]: 2025-12-03 00:06:22.571 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:26 compute-1 nova_compute[187157]: 2025-12-03 00:06:26.714 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:27 compute-1 nova_compute[187157]: 2025-12-03 00:06:27.572 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:28 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:06:28.801 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:17:82 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-633013f1-c17e-45b0-841b-2c82c9dddeea', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-633013f1-c17e-45b0-841b-2c82c9dddeea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '869170c9b0864bd8a0f2258e90e55a84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6cd602f0-6d27-4f32-958a-fa46ec296bd3, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=94de005e-3a8b-4f10-829a-627ec3895f56) old=Port_Binding(mac=['fa:16:3e:36:17:82'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-633013f1-c17e-45b0-841b-2c82c9dddeea', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-633013f1-c17e-45b0-841b-2c82c9dddeea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '869170c9b0864bd8a0f2258e90e55a84', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:06:28 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:06:28.802 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 94de005e-3a8b-4f10-829a-627ec3895f56 in datapath 633013f1-c17e-45b0-841b-2c82c9dddeea updated
Dec 03 00:06:28 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:06:28.803 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 633013f1-c17e-45b0-841b-2c82c9dddeea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:06:28 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:06:28.803 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e360cdce-1a5d-4185-b9e6-21d02ac261c9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:30 compute-1 podman[214516]: 2025-12-03 00:06:30.214731526 +0000 UTC m=+0.055321751 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2025-08-20T13:12:41)
Dec 03 00:06:31 compute-1 nova_compute[187157]: 2025-12-03 00:06:31.716 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:32 compute-1 nova_compute[187157]: 2025-12-03 00:06:32.574 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:33 compute-1 podman[214539]: 2025-12-03 00:06:33.219039302 +0000 UTC m=+0.064077849 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 03 00:06:33 compute-1 nova_compute[187157]: 2025-12-03 00:06:33.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:34 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:06:34.992 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:06:34 compute-1 nova_compute[187157]: 2025-12-03 00:06:34.992 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:34 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:06:34.994 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:06:35 compute-1 podman[197537]: time="2025-12-03T00:06:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:06:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:06:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:06:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:06:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2616 "" "Go-http-client/1.1"
Dec 03 00:06:36 compute-1 nova_compute[187157]: 2025-12-03 00:06:36.718 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:37 compute-1 nova_compute[187157]: 2025-12-03 00:06:37.574 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:37 compute-1 sshd-session[214560]: Received disconnect from 193.46.255.244 port 62918:11:  [preauth]
Dec 03 00:06:37 compute-1 sshd-session[214560]: Disconnected from authenticating user root 193.46.255.244 port 62918 [preauth]
Dec 03 00:06:39 compute-1 nova_compute[187157]: 2025-12-03 00:06:39.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:40 compute-1 nova_compute[187157]: 2025-12-03 00:06:40.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:40 compute-1 ovn_controller[95464]: 2025-12-03T00:06:40Z|00151|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Dec 03 00:06:41 compute-1 podman[214562]: 2025-12-03 00:06:41.212826081 +0000 UTC m=+0.050631220 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:06:41 compute-1 nova_compute[187157]: 2025-12-03 00:06:41.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:06:41 compute-1 nova_compute[187157]: 2025-12-03 00:06:41.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:06:41 compute-1 nova_compute[187157]: 2025-12-03 00:06:41.217 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:06:41 compute-1 nova_compute[187157]: 2025-12-03 00:06:41.217 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:06:41 compute-1 nova_compute[187157]: 2025-12-03 00:06:41.342 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:06:41 compute-1 nova_compute[187157]: 2025-12-03 00:06:41.343 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:06:41 compute-1 nova_compute[187157]: 2025-12-03 00:06:41.358 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:06:41 compute-1 nova_compute[187157]: 2025-12-03 00:06:41.358 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5845MB free_disk=73.16625213623047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:06:41 compute-1 nova_compute[187157]: 2025-12-03 00:06:41.359 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:06:41 compute-1 nova_compute[187157]: 2025-12-03 00:06:41.359 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:06:41 compute-1 nova_compute[187157]: 2025-12-03 00:06:41.720 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:42 compute-1 nova_compute[187157]: 2025-12-03 00:06:42.399 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:06:42 compute-1 nova_compute[187157]: 2025-12-03 00:06:42.400 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:06:41 up  1:13,  0 user,  load average: 0.50, 0.40, 0.38\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:06:42 compute-1 nova_compute[187157]: 2025-12-03 00:06:42.413 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:06:42 compute-1 nova_compute[187157]: 2025-12-03 00:06:42.577 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:42 compute-1 nova_compute[187157]: 2025-12-03 00:06:42.920 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:06:43 compute-1 nova_compute[187157]: 2025-12-03 00:06:43.429 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:06:43 compute-1 nova_compute[187157]: 2025-12-03 00:06:43.429 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.070s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:06:44 compute-1 nova_compute[187157]: 2025-12-03 00:06:44.430 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:44 compute-1 nova_compute[187157]: 2025-12-03 00:06:44.430 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:44 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:06:44.995 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:06:46 compute-1 nova_compute[187157]: 2025-12-03 00:06:46.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:46 compute-1 nova_compute[187157]: 2025-12-03 00:06:46.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:06:46 compute-1 nova_compute[187157]: 2025-12-03 00:06:46.722 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:47 compute-1 podman[214589]: 2025-12-03 00:06:47.233400979 +0000 UTC m=+0.079815044 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Dec 03 00:06:47 compute-1 nova_compute[187157]: 2025-12-03 00:06:47.579 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:48 compute-1 nova_compute[187157]: 2025-12-03 00:06:48.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:49 compute-1 podman[214616]: 2025-12-03 00:06:49.19830951 +0000 UTC m=+0.038751735 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 03 00:06:49 compute-1 openstack_network_exporter[199685]: ERROR   00:06:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:06:49 compute-1 openstack_network_exporter[199685]: ERROR   00:06:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:06:49 compute-1 openstack_network_exporter[199685]: ERROR   00:06:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:06:49 compute-1 openstack_network_exporter[199685]: ERROR   00:06:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:06:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:06:49 compute-1 openstack_network_exporter[199685]: ERROR   00:06:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:06:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:06:49 compute-1 nova_compute[187157]: 2025-12-03 00:06:49.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:51 compute-1 nova_compute[187157]: 2025-12-03 00:06:51.723 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:52 compute-1 nova_compute[187157]: 2025-12-03 00:06:52.582 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:53 compute-1 nova_compute[187157]: 2025-12-03 00:06:53.695 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:56 compute-1 nova_compute[187157]: 2025-12-03 00:06:56.726 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:57 compute-1 nova_compute[187157]: 2025-12-03 00:06:57.583 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:59 compute-1 nova_compute[187157]: 2025-12-03 00:06:59.708 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:06:59 compute-1 nova_compute[187157]: 2025-12-03 00:06:59.709 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:00 compute-1 nova_compute[187157]: 2025-12-03 00:07:00.216 187161 DEBUG nova.compute.manager [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:07:00 compute-1 nova_compute[187157]: 2025-12-03 00:07:00.771 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:00 compute-1 nova_compute[187157]: 2025-12-03 00:07:00.771 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:00 compute-1 nova_compute[187157]: 2025-12-03 00:07:00.777 187161 DEBUG nova.virt.hardware [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:07:00 compute-1 nova_compute[187157]: 2025-12-03 00:07:00.777 187161 INFO nova.compute.claims [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Claim successful on node compute-1.ctlplane.example.com
Dec 03 00:07:01 compute-1 podman[214635]: 2025-12-03 00:07:01.266770229 +0000 UTC m=+0.102332742 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:07:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:01.725 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:01.725 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:01.725 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:01 compute-1 nova_compute[187157]: 2025-12-03 00:07:01.728 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:01 compute-1 nova_compute[187157]: 2025-12-03 00:07:01.838 187161 DEBUG nova.compute.provider_tree [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:07:02 compute-1 nova_compute[187157]: 2025-12-03 00:07:02.345 187161 DEBUG nova.scheduler.client.report [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:07:02 compute-1 nova_compute[187157]: 2025-12-03 00:07:02.585 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:02 compute-1 nova_compute[187157]: 2025-12-03 00:07:02.856 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.085s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:02 compute-1 nova_compute[187157]: 2025-12-03 00:07:02.857 187161 DEBUG nova.compute.manager [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:07:03 compute-1 nova_compute[187157]: 2025-12-03 00:07:03.366 187161 DEBUG nova.compute.manager [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:07:03 compute-1 nova_compute[187157]: 2025-12-03 00:07:03.366 187161 DEBUG nova.network.neutron [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:07:03 compute-1 nova_compute[187157]: 2025-12-03 00:07:03.367 187161 WARNING neutronclient.v2_0.client [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:03 compute-1 nova_compute[187157]: 2025-12-03 00:07:03.367 187161 WARNING neutronclient.v2_0.client [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:03 compute-1 nova_compute[187157]: 2025-12-03 00:07:03.874 187161 INFO nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:07:04 compute-1 podman[214657]: 2025-12-03 00:07:04.218358629 +0000 UTC m=+0.057434322 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd)
Dec 03 00:07:04 compute-1 nova_compute[187157]: 2025-12-03 00:07:04.336 187161 DEBUG nova.network.neutron [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Successfully created port: f32103d7-9e67-4711-8680-5684cc43d30e _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:07:04 compute-1 nova_compute[187157]: 2025-12-03 00:07:04.383 187161 DEBUG nova.compute.manager [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:07:04 compute-1 nova_compute[187157]: 2025-12-03 00:07:04.895 187161 DEBUG nova.network.neutron [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Successfully updated port: f32103d7-9e67-4711-8680-5684cc43d30e _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:07:04 compute-1 nova_compute[187157]: 2025-12-03 00:07:04.951 187161 DEBUG nova.compute.manager [req-dc08d1de-10d5-4911-9311-569792bf410f req-5ce7a5af-1f73-4c15-9808-efb20116f716 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Received event network-changed-f32103d7-9e67-4711-8680-5684cc43d30e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:07:04 compute-1 nova_compute[187157]: 2025-12-03 00:07:04.951 187161 DEBUG nova.compute.manager [req-dc08d1de-10d5-4911-9311-569792bf410f req-5ce7a5af-1f73-4c15-9808-efb20116f716 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Refreshing instance network info cache due to event network-changed-f32103d7-9e67-4711-8680-5684cc43d30e. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:07:04 compute-1 nova_compute[187157]: 2025-12-03 00:07:04.951 187161 DEBUG oslo_concurrency.lockutils [req-dc08d1de-10d5-4911-9311-569792bf410f req-5ce7a5af-1f73-4c15-9808-efb20116f716 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-7245dda2-9732-4cc5-acfc-c277bdea6b4f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:07:04 compute-1 nova_compute[187157]: 2025-12-03 00:07:04.952 187161 DEBUG oslo_concurrency.lockutils [req-dc08d1de-10d5-4911-9311-569792bf410f req-5ce7a5af-1f73-4c15-9808-efb20116f716 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-7245dda2-9732-4cc5-acfc-c277bdea6b4f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:07:04 compute-1 nova_compute[187157]: 2025-12-03 00:07:04.952 187161 DEBUG nova.network.neutron [req-dc08d1de-10d5-4911-9311-569792bf410f req-5ce7a5af-1f73-4c15-9808-efb20116f716 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Refreshing network info cache for port f32103d7-9e67-4711-8680-5684cc43d30e _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.398 187161 DEBUG nova.compute.manager [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.400 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.401 187161 INFO nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Creating image(s)
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.402 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "/var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.402 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "/var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.404 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "/var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.406 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.412 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.414 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "refresh_cache-7245dda2-9732-4cc5-acfc-c277bdea6b4f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.415 187161 DEBUG oslo_concurrency.processutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.457 187161 WARNING neutronclient.v2_0.client [req-dc08d1de-10d5-4911-9311-569792bf410f req-5ce7a5af-1f73-4c15-9808-efb20116f716 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.491 187161 DEBUG oslo_concurrency.processutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.492 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.492 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.493 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.497 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.497 187161 DEBUG oslo_concurrency.processutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.558 187161 DEBUG oslo_concurrency.processutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.559 187161 DEBUG oslo_concurrency.processutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.589 187161 DEBUG oslo_concurrency.processutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.589 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.097s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.590 187161 DEBUG oslo_concurrency.processutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.637 187161 DEBUG oslo_concurrency.processutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.638 187161 DEBUG nova.virt.disk.api [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Checking if we can resize image /var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.638 187161 DEBUG oslo_concurrency.processutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:05 compute-1 podman[197537]: time="2025-12-03T00:07:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:07:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:07:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:07:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:07:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2612 "" "Go-http-client/1.1"
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.686 187161 DEBUG oslo_concurrency.processutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.686 187161 DEBUG nova.virt.disk.api [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Cannot resize image /var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.687 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.687 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Ensure instance console log exists: /var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.688 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.688 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:05 compute-1 nova_compute[187157]: 2025-12-03 00:07:05.688 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:06 compute-1 nova_compute[187157]: 2025-12-03 00:07:06.507 187161 DEBUG nova.network.neutron [req-dc08d1de-10d5-4911-9311-569792bf410f req-5ce7a5af-1f73-4c15-9808-efb20116f716 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:07:06 compute-1 nova_compute[187157]: 2025-12-03 00:07:06.730 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:07 compute-1 nova_compute[187157]: 2025-12-03 00:07:07.558 187161 DEBUG nova.network.neutron [req-dc08d1de-10d5-4911-9311-569792bf410f req-5ce7a5af-1f73-4c15-9808-efb20116f716 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:07:07 compute-1 nova_compute[187157]: 2025-12-03 00:07:07.587 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:08 compute-1 nova_compute[187157]: 2025-12-03 00:07:08.065 187161 DEBUG oslo_concurrency.lockutils [req-dc08d1de-10d5-4911-9311-569792bf410f req-5ce7a5af-1f73-4c15-9808-efb20116f716 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-7245dda2-9732-4cc5-acfc-c277bdea6b4f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:07:08 compute-1 nova_compute[187157]: 2025-12-03 00:07:08.066 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquired lock "refresh_cache-7245dda2-9732-4cc5-acfc-c277bdea6b4f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:07:08 compute-1 nova_compute[187157]: 2025-12-03 00:07:08.066 187161 DEBUG nova.network.neutron [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:07:08 compute-1 nova_compute[187157]: 2025-12-03 00:07:08.779 187161 DEBUG nova.network.neutron [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:07:08 compute-1 nova_compute[187157]: 2025-12-03 00:07:08.989 187161 WARNING neutronclient.v2_0.client [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.144 187161 DEBUG nova.network.neutron [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Updating instance_info_cache with network_info: [{"id": "f32103d7-9e67-4711-8680-5684cc43d30e", "address": "fa:16:3e:98:63:08", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf32103d7-9e", "ovs_interfaceid": "f32103d7-9e67-4711-8680-5684cc43d30e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.650 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Releasing lock "refresh_cache-7245dda2-9732-4cc5-acfc-c277bdea6b4f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.651 187161 DEBUG nova.compute.manager [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Instance network_info: |[{"id": "f32103d7-9e67-4711-8680-5684cc43d30e", "address": "fa:16:3e:98:63:08", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf32103d7-9e", "ovs_interfaceid": "f32103d7-9e67-4711-8680-5684cc43d30e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.653 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Start _get_guest_xml network_info=[{"id": "f32103d7-9e67-4711-8680-5684cc43d30e", "address": "fa:16:3e:98:63:08", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf32103d7-9e", "ovs_interfaceid": "f32103d7-9e67-4711-8680-5684cc43d30e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.656 187161 WARNING nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.657 187161 DEBUG nova.virt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-528950227', uuid='7245dda2-9732-4cc5-acfc-c277bdea6b4f'), owner=OwnerMeta(userid='d7f72082c96e4f868d5b158a57237cee', username='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin', projectid='869170c9b0864bd8a0f2258e90e55a84', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "f32103d7-9e67-4711-8680-5684cc43d30e", "address": "fa:16:3e:98:63:08", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf32103d7-9e", "ovs_interfaceid": "f32103d7-9e67-4711-8680-5684cc43d30e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764720429.6577573) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.709 187161 DEBUG nova.virt.libvirt.host [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.710 187161 DEBUG nova.virt.libvirt.host [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.746 187161 DEBUG nova.virt.libvirt.host [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.747 187161 DEBUG nova.virt.libvirt.host [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.748 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.748 187161 DEBUG nova.virt.hardware [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.748 187161 DEBUG nova.virt.hardware [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.748 187161 DEBUG nova.virt.hardware [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.749 187161 DEBUG nova.virt.hardware [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.749 187161 DEBUG nova.virt.hardware [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.749 187161 DEBUG nova.virt.hardware [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.749 187161 DEBUG nova.virt.hardware [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.749 187161 DEBUG nova.virt.hardware [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.750 187161 DEBUG nova.virt.hardware [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.750 187161 DEBUG nova.virt.hardware [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.750 187161 DEBUG nova.virt.hardware [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.753 187161 DEBUG nova.virt.libvirt.vif [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:06:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-528950227',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-528',id=17,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-rsnmfdif',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:07:04Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=7245dda2-9732-4cc5-acfc-c277bdea6b4f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f32103d7-9e67-4711-8680-5684cc43d30e", "address": "fa:16:3e:98:63:08", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf32103d7-9e", "ovs_interfaceid": "f32103d7-9e67-4711-8680-5684cc43d30e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.754 187161 DEBUG nova.network.os_vif_util [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converting VIF {"id": "f32103d7-9e67-4711-8680-5684cc43d30e", "address": "fa:16:3e:98:63:08", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf32103d7-9e", "ovs_interfaceid": "f32103d7-9e67-4711-8680-5684cc43d30e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.754 187161 DEBUG nova.network.os_vif_util [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:63:08,bridge_name='br-int',has_traffic_filtering=True,id=f32103d7-9e67-4711-8680-5684cc43d30e,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf32103d7-9e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:07:09 compute-1 nova_compute[187157]: 2025-12-03 00:07:09.755 187161 DEBUG nova.objects.instance [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7245dda2-9732-4cc5-acfc-c277bdea6b4f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.273 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:07:10 compute-1 nova_compute[187157]:   <uuid>7245dda2-9732-4cc5-acfc-c277bdea6b4f</uuid>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   <name>instance-00000011</name>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   <memory>131072</memory>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   <vcpu>1</vcpu>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   <metadata>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-528950227</nova:name>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-03 00:07:09</nova:creationTime>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:07:10 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 03 00:07:10 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 03 00:07:10 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 03 00:07:10 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:07:10 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:07:10 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 03 00:07:10 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:07:10 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:07:10 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:07:10 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:07:10 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:07:10 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 03 00:07:10 compute-1 nova_compute[187157]:         <nova:properties>
Dec 03 00:07:10 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:07:10 compute-1 nova_compute[187157]:         </nova:properties>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       </nova:image>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <nova:owner>
Dec 03 00:07:10 compute-1 nova_compute[187157]:         <nova:user uuid="d7f72082c96e4f868d5b158a57237cee">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin</nova:user>
Dec 03 00:07:10 compute-1 nova_compute[187157]:         <nova:project uuid="869170c9b0864bd8a0f2258e90e55a84">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579</nova:project>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       </nova:owner>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <nova:ports>
Dec 03 00:07:10 compute-1 nova_compute[187157]:         <nova:port uuid="f32103d7-9e67-4711-8680-5684cc43d30e">
Dec 03 00:07:10 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:         </nova:port>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       </nova:ports>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     </nova:instance>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   </metadata>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <system>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <entry name="serial">7245dda2-9732-4cc5-acfc-c277bdea6b4f</entry>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <entry name="uuid">7245dda2-9732-4cc5-acfc-c277bdea6b4f</entry>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     </system>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   </sysinfo>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   <os>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   </os>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   <features>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <acpi/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <apic/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <vmcoreinfo/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   </features>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   </clock>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact">
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <model>Nehalem</model>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   </cpu>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   <devices>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk.config"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:98:63:08"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <target dev="tapf32103d7-9e"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     </interface>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/console.log" append="off"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     </serial>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <video>
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     </video>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     </rng>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <controller type="usb" index="0"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:07:10 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 03 00:07:10 compute-1 nova_compute[187157]:     </memballoon>
Dec 03 00:07:10 compute-1 nova_compute[187157]:   </devices>
Dec 03 00:07:10 compute-1 nova_compute[187157]: </domain>
Dec 03 00:07:10 compute-1 nova_compute[187157]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.274 187161 DEBUG nova.compute.manager [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Preparing to wait for external event network-vif-plugged-f32103d7-9e67-4711-8680-5684cc43d30e prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.274 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.275 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.275 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.276 187161 DEBUG nova.virt.libvirt.vif [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:06:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-528950227',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-528',id=17,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-rsnmfdif',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:07:04Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=7245dda2-9732-4cc5-acfc-c277bdea6b4f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f32103d7-9e67-4711-8680-5684cc43d30e", "address": "fa:16:3e:98:63:08", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf32103d7-9e", "ovs_interfaceid": "f32103d7-9e67-4711-8680-5684cc43d30e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.276 187161 DEBUG nova.network.os_vif_util [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converting VIF {"id": "f32103d7-9e67-4711-8680-5684cc43d30e", "address": "fa:16:3e:98:63:08", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf32103d7-9e", "ovs_interfaceid": "f32103d7-9e67-4711-8680-5684cc43d30e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.276 187161 DEBUG nova.network.os_vif_util [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:63:08,bridge_name='br-int',has_traffic_filtering=True,id=f32103d7-9e67-4711-8680-5684cc43d30e,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf32103d7-9e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.277 187161 DEBUG os_vif [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:63:08,bridge_name='br-int',has_traffic_filtering=True,id=f32103d7-9e67-4711-8680-5684cc43d30e,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf32103d7-9e') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.277 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.278 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.278 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.278 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.279 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '05fbdf03-9939-5f14-b054-290e5796bc36', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.280 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.281 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.282 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.283 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.284 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf32103d7-9e, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.284 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapf32103d7-9e, col_values=(('qos', UUID('12b22bd3-2110-4e58-9d45-4de5bc510cb2')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.284 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapf32103d7-9e, col_values=(('external_ids', {'iface-id': 'f32103d7-9e67-4711-8680-5684cc43d30e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:98:63:08', 'vm-uuid': '7245dda2-9732-4cc5-acfc-c277bdea6b4f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.285 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:10 compute-1 NetworkManager[55553]: <info>  [1764720430.2871] manager: (tapf32103d7-9e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.287 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.293 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:10 compute-1 nova_compute[187157]: 2025-12-03 00:07:10.294 187161 INFO os_vif [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:63:08,bridge_name='br-int',has_traffic_filtering=True,id=f32103d7-9e67-4711-8680-5684cc43d30e,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf32103d7-9e')
Dec 03 00:07:12 compute-1 nova_compute[187157]: 2025-12-03 00:07:12.030 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:07:12 compute-1 nova_compute[187157]: 2025-12-03 00:07:12.031 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:07:12 compute-1 nova_compute[187157]: 2025-12-03 00:07:12.031 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] No VIF found with MAC fa:16:3e:98:63:08, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:07:12 compute-1 nova_compute[187157]: 2025-12-03 00:07:12.031 187161 INFO nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Using config drive
Dec 03 00:07:12 compute-1 podman[214695]: 2025-12-03 00:07:12.205192773 +0000 UTC m=+0.044216307 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:07:12 compute-1 nova_compute[187157]: 2025-12-03 00:07:12.540 187161 WARNING neutronclient.v2_0.client [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:12 compute-1 nova_compute[187157]: 2025-12-03 00:07:12.588 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:12 compute-1 nova_compute[187157]: 2025-12-03 00:07:12.699 187161 INFO nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Creating config drive at /var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk.config
Dec 03 00:07:12 compute-1 nova_compute[187157]: 2025-12-03 00:07:12.703 187161 DEBUG oslo_concurrency.processutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp9zhjajo0 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:12 compute-1 nova_compute[187157]: 2025-12-03 00:07:12.826 187161 DEBUG oslo_concurrency.processutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp9zhjajo0" returned: 0 in 0.123s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:12 compute-1 kernel: tapf32103d7-9e: entered promiscuous mode
Dec 03 00:07:12 compute-1 NetworkManager[55553]: <info>  [1764720432.8802] manager: (tapf32103d7-9e): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Dec 03 00:07:12 compute-1 ovn_controller[95464]: 2025-12-03T00:07:12Z|00152|binding|INFO|Claiming lport f32103d7-9e67-4711-8680-5684cc43d30e for this chassis.
Dec 03 00:07:12 compute-1 ovn_controller[95464]: 2025-12-03T00:07:12Z|00153|binding|INFO|f32103d7-9e67-4711-8680-5684cc43d30e: Claiming fa:16:3e:98:63:08 10.100.0.11
Dec 03 00:07:12 compute-1 nova_compute[187157]: 2025-12-03 00:07:12.878 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:12 compute-1 nova_compute[187157]: 2025-12-03 00:07:12.882 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:12 compute-1 nova_compute[187157]: 2025-12-03 00:07:12.886 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:12.895 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:63:08 10.100.0.11'], port_security=['fa:16:3e:98:63:08 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '7245dda2-9732-4cc5-acfc-c277bdea6b4f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '869170c9b0864bd8a0f2258e90e55a84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '21025524-a834-4687-a5db-4097a3a2991d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fb2dc55-b9aa-4540-a79d-797e2b8e81ae, chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=f32103d7-9e67-4711-8680-5684cc43d30e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:07:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:12.896 104348 INFO neutron.agent.ovn.metadata.agent [-] Port f32103d7-9e67-4711-8680-5684cc43d30e in datapath 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 bound to our chassis
Dec 03 00:07:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:12.897 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68
Dec 03 00:07:12 compute-1 systemd-udevd[214740]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:07:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:12.906 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[17b80ad2-d3ba-44fc-802d-91cb5b163dcc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:12.907 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9c6ad8f4-61 in ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:07:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:12.908 207957 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9c6ad8f4-60 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:07:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:12.909 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[142c276f-b3c1-47ae-9d44-ac61bb90ba05]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:12 compute-1 systemd-machined[153454]: New machine qemu-13-instance-00000011.
Dec 03 00:07:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:12.909 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[dd4b93ab-2eb5-4a48-ba03-2a7f300b9e4f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:12 compute-1 NetworkManager[55553]: <info>  [1764720432.9179] device (tapf32103d7-9e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:07:12 compute-1 NetworkManager[55553]: <info>  [1764720432.9185] device (tapf32103d7-9e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:07:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:12.918 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[baade612-8f81-4bb2-b635-1bdb4adb4ca0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:12 compute-1 nova_compute[187157]: 2025-12-03 00:07:12.938 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:12.939 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[79d7d8b1-bdf5-44fe-8249-5fe9dbfd7e35]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:12 compute-1 ovn_controller[95464]: 2025-12-03T00:07:12Z|00154|binding|INFO|Setting lport f32103d7-9e67-4711-8680-5684cc43d30e ovn-installed in OVS
Dec 03 00:07:12 compute-1 ovn_controller[95464]: 2025-12-03T00:07:12Z|00155|binding|INFO|Setting lport f32103d7-9e67-4711-8680-5684cc43d30e up in Southbound
Dec 03 00:07:12 compute-1 systemd[1]: Started Virtual Machine qemu-13-instance-00000011.
Dec 03 00:07:12 compute-1 nova_compute[187157]: 2025-12-03 00:07:12.942 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:12.964 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[ed1f94ba-cb80-4293-8285-9f29ce6ffe7f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:12.970 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[1fcff979-9585-4e3b-a12a-3c8d04eeae47]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:12 compute-1 NetworkManager[55553]: <info>  [1764720432.9710] manager: (tap9c6ad8f4-60): new Veth device (/org/freedesktop/NetworkManager/Devices/57)
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.001 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[ddf0ff34-4d98-4f40-a56e-11107ee01ed7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.004 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[54e3ca52-871f-4b03-9b57-9a1713120277]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:13 compute-1 NetworkManager[55553]: <info>  [1764720433.0234] device (tap9c6ad8f4-60): carrier: link connected
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.026 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[44ee22f7-d2bf-48a1-90b3-bf6330cb33a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.038 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a01db093-7ca9-4b0f-a150-1c378a60e488]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c6ad8f4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:f8:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445267, 'reachable_time': 33883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214772, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.048 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[db87b9a2-3d84-4146-8ef8-45a133da6edd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe04:f806'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445267, 'tstamp': 445267}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214773, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.058 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[8fd308a5-4674-45ab-a536-1744fb26a404]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c6ad8f4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:f8:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445267, 'reachable_time': 33883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214774, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.078 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ba3b5db5-d60c-463b-9a4f-5d76267496d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.126 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[24388603-ae12-4126-9ced-7de253f38ae9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.127 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c6ad8f4-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.127 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.128 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c6ad8f4-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:13 compute-1 kernel: tap9c6ad8f4-60: entered promiscuous mode
Dec 03 00:07:13 compute-1 nova_compute[187157]: 2025-12-03 00:07:13.168 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:13 compute-1 NetworkManager[55553]: <info>  [1764720433.1720] manager: (tap9c6ad8f4-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.174 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9c6ad8f4-60, col_values=(('external_ids', {'iface-id': 'df9da247-f3c2-412c-95a4-9a2562c93dd4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:13 compute-1 nova_compute[187157]: 2025-12-03 00:07:13.173 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:13 compute-1 ovn_controller[95464]: 2025-12-03T00:07:13Z|00156|binding|INFO|Releasing lport df9da247-f3c2-412c-95a4-9a2562c93dd4 from this chassis (sb_readonly=0)
Dec 03 00:07:13 compute-1 nova_compute[187157]: 2025-12-03 00:07:13.187 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.187 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d5a47514-a849-4288-aebf-89cc9518a58d]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.188 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.188 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.188 104348 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.188 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:07:13 compute-1 nova_compute[187157]: 2025-12-03 00:07:13.188 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.189 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[1bbe4e0c-2616-415a-a0e5-5b6425434984]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.189 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.189 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[1c1231fa-bf6e-4a1b-9385-6ea8d90fbc6c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.190 104348 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: global
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     log         /dev/log local0 debug
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     log-tag     haproxy-metadata-proxy-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     user        root
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     group       root
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     maxconn     1024
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     pidfile     /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     daemon
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: defaults
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     log global
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     mode http
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     option httplog
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     option dontlognull
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     option http-server-close
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     option forwardfor
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     retries                 3
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     timeout http-request    30s
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     timeout connect         30s
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     timeout client          32s
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     timeout server          32s
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     timeout http-keep-alive 30s
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: listen listener
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     bind 169.254.169.254:80
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:     http-request add-header X-OVN-Network-ID 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:07:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:13.190 104348 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'env', 'PROCESS_TAG=haproxy-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:07:13 compute-1 podman[214813]: 2025-12-03 00:07:13.532016263 +0000 UTC m=+0.043442068 container create b14f0ecb0740de2325664d82263f80e5f3184caf4d3986b6ccfceb6a6140d612 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:07:13 compute-1 systemd[1]: Started libpod-conmon-b14f0ecb0740de2325664d82263f80e5f3184caf4d3986b6ccfceb6a6140d612.scope.
Dec 03 00:07:13 compute-1 systemd[1]: Started libcrun container.
Dec 03 00:07:13 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b42e476cb975dea83bc4655999d75c5acf006989e90c7537ebb8bc26effb602a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:07:13 compute-1 podman[214813]: 2025-12-03 00:07:13.509946857 +0000 UTC m=+0.021372672 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:07:13 compute-1 podman[214813]: 2025-12-03 00:07:13.606170532 +0000 UTC m=+0.117596397 container init b14f0ecb0740de2325664d82263f80e5f3184caf4d3986b6ccfceb6a6140d612 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.4)
Dec 03 00:07:13 compute-1 podman[214813]: 2025-12-03 00:07:13.611385336 +0000 UTC m=+0.122811151 container start b14f0ecb0740de2325664d82263f80e5f3184caf4d3986b6ccfceb6a6140d612 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.4)
Dec 03 00:07:13 compute-1 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[214828]: [NOTICE]   (214832) : New worker (214834) forked
Dec 03 00:07:13 compute-1 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[214828]: [NOTICE]   (214832) : Loading success.
Dec 03 00:07:13 compute-1 nova_compute[187157]: 2025-12-03 00:07:13.636 187161 DEBUG nova.compute.manager [req-4513c0fb-196c-458e-91cd-56f782decbcd req-c4d2348e-e131-4180-ad46-c1de83829103 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Received event network-vif-plugged-f32103d7-9e67-4711-8680-5684cc43d30e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:07:13 compute-1 nova_compute[187157]: 2025-12-03 00:07:13.636 187161 DEBUG oslo_concurrency.lockutils [req-4513c0fb-196c-458e-91cd-56f782decbcd req-c4d2348e-e131-4180-ad46-c1de83829103 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:13 compute-1 nova_compute[187157]: 2025-12-03 00:07:13.637 187161 DEBUG oslo_concurrency.lockutils [req-4513c0fb-196c-458e-91cd-56f782decbcd req-c4d2348e-e131-4180-ad46-c1de83829103 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:13 compute-1 nova_compute[187157]: 2025-12-03 00:07:13.637 187161 DEBUG oslo_concurrency.lockutils [req-4513c0fb-196c-458e-91cd-56f782decbcd req-c4d2348e-e131-4180-ad46-c1de83829103 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:13 compute-1 nova_compute[187157]: 2025-12-03 00:07:13.637 187161 DEBUG nova.compute.manager [req-4513c0fb-196c-458e-91cd-56f782decbcd req-c4d2348e-e131-4180-ad46-c1de83829103 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Processing event network-vif-plugged-f32103d7-9e67-4711-8680-5684cc43d30e _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:07:13 compute-1 nova_compute[187157]: 2025-12-03 00:07:13.638 187161 DEBUG nova.compute.manager [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:07:13 compute-1 nova_compute[187157]: 2025-12-03 00:07:13.642 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:07:13 compute-1 nova_compute[187157]: 2025-12-03 00:07:13.649 187161 INFO nova.virt.libvirt.driver [-] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Instance spawned successfully.
Dec 03 00:07:13 compute-1 nova_compute[187157]: 2025-12-03 00:07:13.650 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:07:14 compute-1 nova_compute[187157]: 2025-12-03 00:07:14.165 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:07:14 compute-1 nova_compute[187157]: 2025-12-03 00:07:14.166 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:07:14 compute-1 nova_compute[187157]: 2025-12-03 00:07:14.167 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:07:14 compute-1 nova_compute[187157]: 2025-12-03 00:07:14.168 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:07:14 compute-1 nova_compute[187157]: 2025-12-03 00:07:14.168 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:07:14 compute-1 nova_compute[187157]: 2025-12-03 00:07:14.169 187161 DEBUG nova.virt.libvirt.driver [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:07:14 compute-1 nova_compute[187157]: 2025-12-03 00:07:14.686 187161 INFO nova.compute.manager [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Took 9.29 seconds to spawn the instance on the hypervisor.
Dec 03 00:07:14 compute-1 nova_compute[187157]: 2025-12-03 00:07:14.686 187161 DEBUG nova.compute.manager [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:07:15 compute-1 nova_compute[187157]: 2025-12-03 00:07:15.220 187161 INFO nova.compute.manager [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Took 14.49 seconds to build instance.
Dec 03 00:07:15 compute-1 nova_compute[187157]: 2025-12-03 00:07:15.285 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:15 compute-1 nova_compute[187157]: 2025-12-03 00:07:15.700 187161 DEBUG nova.compute.manager [req-1743d0b7-d81e-4472-8314-802bab187e9d req-c5c665a1-dd47-47e0-968d-6df8b0c68d86 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Received event network-vif-plugged-f32103d7-9e67-4711-8680-5684cc43d30e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:07:15 compute-1 nova_compute[187157]: 2025-12-03 00:07:15.700 187161 DEBUG oslo_concurrency.lockutils [req-1743d0b7-d81e-4472-8314-802bab187e9d req-c5c665a1-dd47-47e0-968d-6df8b0c68d86 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:15 compute-1 nova_compute[187157]: 2025-12-03 00:07:15.700 187161 DEBUG oslo_concurrency.lockutils [req-1743d0b7-d81e-4472-8314-802bab187e9d req-c5c665a1-dd47-47e0-968d-6df8b0c68d86 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:15 compute-1 nova_compute[187157]: 2025-12-03 00:07:15.701 187161 DEBUG oslo_concurrency.lockutils [req-1743d0b7-d81e-4472-8314-802bab187e9d req-c5c665a1-dd47-47e0-968d-6df8b0c68d86 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:15 compute-1 nova_compute[187157]: 2025-12-03 00:07:15.701 187161 DEBUG nova.compute.manager [req-1743d0b7-d81e-4472-8314-802bab187e9d req-c5c665a1-dd47-47e0-968d-6df8b0c68d86 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] No waiting events found dispatching network-vif-plugged-f32103d7-9e67-4711-8680-5684cc43d30e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:07:15 compute-1 nova_compute[187157]: 2025-12-03 00:07:15.701 187161 WARNING nova.compute.manager [req-1743d0b7-d81e-4472-8314-802bab187e9d req-c5c665a1-dd47-47e0-968d-6df8b0c68d86 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Received unexpected event network-vif-plugged-f32103d7-9e67-4711-8680-5684cc43d30e for instance with vm_state active and task_state None.
Dec 03 00:07:15 compute-1 nova_compute[187157]: 2025-12-03 00:07:15.725 187161 DEBUG oslo_concurrency.lockutils [None req-603c12be-5f31-4e27-98e0-a91802ddd93d d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.016s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:17 compute-1 nova_compute[187157]: 2025-12-03 00:07:17.592 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:18 compute-1 podman[214843]: 2025-12-03 00:07:18.267684511 +0000 UTC m=+0.109103453 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4)
Dec 03 00:07:19 compute-1 openstack_network_exporter[199685]: ERROR   00:07:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:07:19 compute-1 openstack_network_exporter[199685]: ERROR   00:07:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:07:19 compute-1 openstack_network_exporter[199685]: ERROR   00:07:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:07:19 compute-1 openstack_network_exporter[199685]: ERROR   00:07:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:07:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:07:19 compute-1 openstack_network_exporter[199685]: ERROR   00:07:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:07:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:07:20 compute-1 podman[214870]: 2025-12-03 00:07:20.236310032 +0000 UTC m=+0.078382901 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Dec 03 00:07:20 compute-1 nova_compute[187157]: 2025-12-03 00:07:20.286 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:22 compute-1 nova_compute[187157]: 2025-12-03 00:07:22.595 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:25 compute-1 nova_compute[187157]: 2025-12-03 00:07:25.287 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:27 compute-1 ovn_controller[95464]: 2025-12-03T00:07:27Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:98:63:08 10.100.0.11
Dec 03 00:07:27 compute-1 ovn_controller[95464]: 2025-12-03T00:07:27Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:98:63:08 10.100.0.11
Dec 03 00:07:27 compute-1 nova_compute[187157]: 2025-12-03 00:07:27.596 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:29 compute-1 nova_compute[187157]: 2025-12-03 00:07:29.865 187161 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Creating tmpfile /var/lib/nova/instances/tmp6wlbjtie to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 03 00:07:29 compute-1 nova_compute[187157]: 2025-12-03 00:07:29.866 187161 WARNING neutronclient.v2_0.client [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:29 compute-1 nova_compute[187157]: 2025-12-03 00:07:29.877 187161 DEBUG nova.compute.manager [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6wlbjtie',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 03 00:07:30 compute-1 nova_compute[187157]: 2025-12-03 00:07:30.333 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:31 compute-1 nova_compute[187157]: 2025-12-03 00:07:31.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:31 compute-1 nova_compute[187157]: 2025-12-03 00:07:31.941 187161 WARNING neutronclient.v2_0.client [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:32 compute-1 podman[214904]: 2025-12-03 00:07:32.228885191 +0000 UTC m=+0.064474619 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 03 00:07:32 compute-1 nova_compute[187157]: 2025-12-03 00:07:32.599 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:33 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 03 00:07:34 compute-1 nova_compute[187157]: 2025-12-03 00:07:34.210 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:35 compute-1 podman[214926]: 2025-12-03 00:07:35.228140047 +0000 UTC m=+0.066789035 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 03 00:07:35 compute-1 nova_compute[187157]: 2025-12-03 00:07:35.335 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:35 compute-1 podman[197537]: time="2025-12-03T00:07:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:07:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:07:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:07:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:07:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3065 "" "Go-http-client/1.1"
Dec 03 00:07:36 compute-1 nova_compute[187157]: 2025-12-03 00:07:36.872 187161 DEBUG nova.compute.manager [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6wlbjtie',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5187b0f8-a8d1-4c99-a0b9-809caf89b88a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 03 00:07:37 compute-1 nova_compute[187157]: 2025-12-03 00:07:37.601 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:37 compute-1 nova_compute[187157]: 2025-12-03 00:07:37.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:37 compute-1 nova_compute[187157]: 2025-12-03 00:07:37.701 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 03 00:07:37 compute-1 nova_compute[187157]: 2025-12-03 00:07:37.890 187161 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-5187b0f8-a8d1-4c99-a0b9-809caf89b88a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:07:37 compute-1 nova_compute[187157]: 2025-12-03 00:07:37.891 187161 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-5187b0f8-a8d1-4c99-a0b9-809caf89b88a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:07:37 compute-1 nova_compute[187157]: 2025-12-03 00:07:37.891 187161 DEBUG nova.network.neutron [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:07:38 compute-1 nova_compute[187157]: 2025-12-03 00:07:38.399 187161 WARNING neutronclient.v2_0.client [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:39 compute-1 nova_compute[187157]: 2025-12-03 00:07:39.083 187161 WARNING neutronclient.v2_0.client [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:39 compute-1 nova_compute[187157]: 2025-12-03 00:07:39.261 187161 DEBUG nova.network.neutron [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Updating instance_info_cache with network_info: [{"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:07:39 compute-1 nova_compute[187157]: 2025-12-03 00:07:39.771 187161 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-5187b0f8-a8d1-4c99-a0b9-809caf89b88a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:07:39 compute-1 nova_compute[187157]: 2025-12-03 00:07:39.789 187161 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6wlbjtie',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5187b0f8-a8d1-4c99-a0b9-809caf89b88a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 03 00:07:39 compute-1 nova_compute[187157]: 2025-12-03 00:07:39.790 187161 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Creating instance directory: /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 03 00:07:39 compute-1 nova_compute[187157]: 2025-12-03 00:07:39.790 187161 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Creating disk.info with the contents: {'/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk': 'qcow2', '/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 03 00:07:39 compute-1 nova_compute[187157]: 2025-12-03 00:07:39.791 187161 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 03 00:07:39 compute-1 nova_compute[187157]: 2025-12-03 00:07:39.791 187161 DEBUG nova.objects.instance [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5187b0f8-a8d1-4c99-a0b9-809caf89b88a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.209 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.298 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.302 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.303 187161 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.336 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.375 187161 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.376 187161 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.377 187161 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.378 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.385 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.385 187161 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.474 187161 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.475 187161 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.517 187161 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.518 187161 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.519 187161 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.590 187161 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.591 187161 DEBUG nova.virt.disk.api [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.591 187161 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.648 187161 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.650 187161 DEBUG nova.virt.disk.api [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.651 187161 DEBUG nova.objects.instance [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 5187b0f8-a8d1-4c99-a0b9-809caf89b88a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:07:40 compute-1 nova_compute[187157]: 2025-12-03 00:07:40.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.163 187161 DEBUG nova.objects.base [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<5187b0f8-a8d1-4c99-a0b9-809caf89b88a> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.164 187161 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.194 187161 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk.config 497664" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.196 187161 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.198 187161 DEBUG nova.virt.libvirt.vif [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:06:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1298851656',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-129',id=16,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:06:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-0lswcfs9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio'
,image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:06:55Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=5187b0f8-a8d1-4c99-a0b9-809caf89b88a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.198 187161 DEBUG nova.network.os_vif_util [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.200 187161 DEBUG nova.network.os_vif_util [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:13:00,bridge_name='br-int',has_traffic_filtering=True,id=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebecba8e-a0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.201 187161 DEBUG os_vif [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:13:00,bridge_name='br-int',has_traffic_filtering=True,id=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebecba8e-a0') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.202 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.203 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.204 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.206 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.206 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '11d1877e-6ca7-5dfb-a517-1864e0abc9eb', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.208 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.210 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.213 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.213 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.214 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.214 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.217 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.217 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapebecba8e-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.218 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapebecba8e-a0, col_values=(('qos', UUID('acb69e85-2c92-41d0-9786-1cdacd991b24')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.218 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapebecba8e-a0, col_values=(('external_ids', {'iface-id': 'ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7d:13:00', 'vm-uuid': '5187b0f8-a8d1-4c99-a0b9-809caf89b88a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.220 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:41 compute-1 NetworkManager[55553]: <info>  [1764720461.2215] manager: (tapebecba8e-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.221 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.227 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.228 187161 INFO os_vif [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:13:00,bridge_name='br-int',has_traffic_filtering=True,id=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebecba8e-a0')
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.229 187161 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.229 187161 DEBUG nova.compute.manager [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6wlbjtie',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5187b0f8-a8d1-4c99-a0b9-809caf89b88a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.230 187161 WARNING neutronclient.v2_0.client [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.511 187161 WARNING neutronclient.v2_0.client [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:41 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:41.781 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:07:41 compute-1 nova_compute[187157]: 2025-12-03 00:07:41.782 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:41 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:41.782 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:07:42 compute-1 nova_compute[187157]: 2025-12-03 00:07:42.125 187161 DEBUG nova.network.neutron [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Port ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 03 00:07:42 compute-1 nova_compute[187157]: 2025-12-03 00:07:42.141 187161 DEBUG nova.compute.manager [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6wlbjtie',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5187b0f8-a8d1-4c99-a0b9-809caf89b88a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 03 00:07:42 compute-1 nova_compute[187157]: 2025-12-03 00:07:42.259 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:42 compute-1 nova_compute[187157]: 2025-12-03 00:07:42.309 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:42 compute-1 nova_compute[187157]: 2025-12-03 00:07:42.310 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:42 compute-1 nova_compute[187157]: 2025-12-03 00:07:42.360 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:42 compute-1 nova_compute[187157]: 2025-12-03 00:07:42.491 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:07:42 compute-1 nova_compute[187157]: 2025-12-03 00:07:42.492 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:42 compute-1 nova_compute[187157]: 2025-12-03 00:07:42.529 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:42 compute-1 nova_compute[187157]: 2025-12-03 00:07:42.529 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5622MB free_disk=73.13680267333984GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:07:42 compute-1 nova_compute[187157]: 2025-12-03 00:07:42.530 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:42 compute-1 nova_compute[187157]: 2025-12-03 00:07:42.530 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:42 compute-1 nova_compute[187157]: 2025-12-03 00:07:42.603 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:42 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:42.783 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:43 compute-1 ovn_controller[95464]: 2025-12-03T00:07:43Z|00157|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 03 00:07:43 compute-1 podman[214974]: 2025-12-03 00:07:43.216145027 +0000 UTC m=+0.053981768 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:07:43 compute-1 nova_compute[187157]: 2025-12-03 00:07:43.564 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Migration for instance 5187b0f8-a8d1-4c99-a0b9-809caf89b88a refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:07:44 compute-1 nova_compute[187157]: 2025-12-03 00:07:44.071 187161 INFO nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Updating resource usage from migration 9e7124fa-e997-4c19-b812-98c74391064a
Dec 03 00:07:44 compute-1 nova_compute[187157]: 2025-12-03 00:07:44.072 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Starting to track incoming migration 9e7124fa-e997-4c19-b812-98c74391064a with flavor b2669e62-ef04-4b34-b3d6-69efcfbafbdc _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 03 00:07:44 compute-1 nova_compute[187157]: 2025-12-03 00:07:44.613 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 7245dda2-9732-4cc5-acfc-c277bdea6b4f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:07:45 compute-1 nova_compute[187157]: 2025-12-03 00:07:45.119 187161 WARNING nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 5187b0f8-a8d1-4c99-a0b9-809caf89b88a has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Dec 03 00:07:45 compute-1 nova_compute[187157]: 2025-12-03 00:07:45.119 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:07:45 compute-1 nova_compute[187157]: 2025-12-03 00:07:45.119 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:07:42 up  1:14,  0 user,  load average: 0.53, 0.42, 0.39\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_869170c9b0864bd8a0f2258e90e55a84': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:07:45 compute-1 nova_compute[187157]: 2025-12-03 00:07:45.174 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:07:45 compute-1 systemd[1]: Starting libvirt proxy daemon...
Dec 03 00:07:45 compute-1 systemd[1]: Started libvirt proxy daemon.
Dec 03 00:07:45 compute-1 kernel: tapebecba8e-a0: entered promiscuous mode
Dec 03 00:07:45 compute-1 NetworkManager[55553]: <info>  [1764720465.6408] manager: (tapebecba8e-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Dec 03 00:07:45 compute-1 nova_compute[187157]: 2025-12-03 00:07:45.641 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:45 compute-1 ovn_controller[95464]: 2025-12-03T00:07:45Z|00158|binding|INFO|Claiming lport ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 for this additional chassis.
Dec 03 00:07:45 compute-1 ovn_controller[95464]: 2025-12-03T00:07:45Z|00159|binding|INFO|ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0: Claiming fa:16:3e:7d:13:00 10.100.0.14
Dec 03 00:07:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:45.650 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:13:00 10.100.0.14'], port_security=['fa:16:3e:7d:13:00 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5187b0f8-a8d1-4c99-a0b9-809caf89b88a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '869170c9b0864bd8a0f2258e90e55a84', 'neutron:revision_number': '10', 'neutron:security_group_ids': '21025524-a834-4687-a5db-4097a3a2991d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fb2dc55-b9aa-4540-a79d-797e2b8e81ae, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:07:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:45.652 104348 INFO neutron.agent.ovn.metadata.agent [-] Port ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 in datapath 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 unbound from our chassis
Dec 03 00:07:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:45.653 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68
Dec 03 00:07:45 compute-1 systemd-udevd[215030]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:07:45 compute-1 ovn_controller[95464]: 2025-12-03T00:07:45Z|00160|binding|INFO|Setting lport ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 ovn-installed in OVS
Dec 03 00:07:45 compute-1 nova_compute[187157]: 2025-12-03 00:07:45.672 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:45 compute-1 nova_compute[187157]: 2025-12-03 00:07:45.673 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:45.676 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[580379aa-81fb-49d4-bb79-88cb4767c422]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:45 compute-1 nova_compute[187157]: 2025-12-03 00:07:45.680 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:07:45 compute-1 nova_compute[187157]: 2025-12-03 00:07:45.683 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:45 compute-1 NetworkManager[55553]: <info>  [1764720465.6859] device (tapebecba8e-a0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:07:45 compute-1 NetworkManager[55553]: <info>  [1764720465.6902] device (tapebecba8e-a0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:07:45 compute-1 systemd-machined[153454]: New machine qemu-14-instance-00000010.
Dec 03 00:07:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:45.715 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[fae5fd82-ed84-49a6-975c-014f06e41156]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:45.718 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[d58b8e11-f560-4617-9eff-b6faaaa1659e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:45 compute-1 systemd[1]: Started Virtual Machine qemu-14-instance-00000010.
Dec 03 00:07:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:45.749 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[695d4f6c-4bf7-471a-a207-8dd9faa489cc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:45.766 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[dd029b6b-6eaa-4328-bafa-72a2577c8d15]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c6ad8f4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:f8:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445267, 'reachable_time': 33883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215040, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:45.787 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[7bca01ac-1e62-467e-b4f6-be142ec99356]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9c6ad8f4-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445275, 'tstamp': 445275}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215046, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9c6ad8f4-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445277, 'tstamp': 445277}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215046, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:45.789 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c6ad8f4-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:45 compute-1 nova_compute[187157]: 2025-12-03 00:07:45.790 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:45 compute-1 nova_compute[187157]: 2025-12-03 00:07:45.791 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:45.792 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c6ad8f4-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:45.793 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:07:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:45.793 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9c6ad8f4-60, col_values=(('external_ids', {'iface-id': 'df9da247-f3c2-412c-95a4-9a2562c93dd4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:45.794 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:07:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:07:45.795 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[4f5babbb-f33f-41f8-92df-8671559d1c54]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:46 compute-1 nova_compute[187157]: 2025-12-03 00:07:46.190 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:07:46 compute-1 nova_compute[187157]: 2025-12-03 00:07:46.190 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.661s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:46 compute-1 nova_compute[187157]: 2025-12-03 00:07:46.220 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:46 compute-1 nova_compute[187157]: 2025-12-03 00:07:46.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:46 compute-1 nova_compute[187157]: 2025-12-03 00:07:46.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:46 compute-1 nova_compute[187157]: 2025-12-03 00:07:46.702 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:46 compute-1 nova_compute[187157]: 2025-12-03 00:07:46.702 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 03 00:07:47 compute-1 nova_compute[187157]: 2025-12-03 00:07:47.210 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 03 00:07:47 compute-1 nova_compute[187157]: 2025-12-03 00:07:47.605 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:49 compute-1 nova_compute[187157]: 2025-12-03 00:07:49.208 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:49 compute-1 nova_compute[187157]: 2025-12-03 00:07:49.208 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:07:49 compute-1 podman[215069]: 2025-12-03 00:07:49.240903855 +0000 UTC m=+0.083788260 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 03 00:07:49 compute-1 openstack_network_exporter[199685]: ERROR   00:07:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:07:49 compute-1 openstack_network_exporter[199685]: ERROR   00:07:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:07:49 compute-1 openstack_network_exporter[199685]: ERROR   00:07:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:07:49 compute-1 openstack_network_exporter[199685]: ERROR   00:07:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:07:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:07:49 compute-1 openstack_network_exporter[199685]: ERROR   00:07:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:07:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:07:49 compute-1 ovn_controller[95464]: 2025-12-03T00:07:49Z|00161|binding|INFO|Claiming lport ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 for this chassis.
Dec 03 00:07:49 compute-1 ovn_controller[95464]: 2025-12-03T00:07:49Z|00162|binding|INFO|ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0: Claiming fa:16:3e:7d:13:00 10.100.0.14
Dec 03 00:07:49 compute-1 ovn_controller[95464]: 2025-12-03T00:07:49Z|00163|binding|INFO|Setting lport ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 up in Southbound
Dec 03 00:07:50 compute-1 nova_compute[187157]: 2025-12-03 00:07:50.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:50 compute-1 nova_compute[187157]: 2025-12-03 00:07:50.961 187161 INFO nova.compute.manager [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Post operation of migration started
Dec 03 00:07:50 compute-1 nova_compute[187157]: 2025-12-03 00:07:50.962 187161 WARNING neutronclient.v2_0.client [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:51 compute-1 nova_compute[187157]: 2025-12-03 00:07:51.223 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:51 compute-1 podman[215097]: 2025-12-03 00:07:51.240942856 +0000 UTC m=+0.066630631 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true)
Dec 03 00:07:51 compute-1 nova_compute[187157]: 2025-12-03 00:07:51.511 187161 WARNING neutronclient.v2_0.client [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:51 compute-1 nova_compute[187157]: 2025-12-03 00:07:51.512 187161 WARNING neutronclient.v2_0.client [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:51 compute-1 nova_compute[187157]: 2025-12-03 00:07:51.646 187161 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-5187b0f8-a8d1-4c99-a0b9-809caf89b88a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:07:51 compute-1 nova_compute[187157]: 2025-12-03 00:07:51.646 187161 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-5187b0f8-a8d1-4c99-a0b9-809caf89b88a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:07:51 compute-1 nova_compute[187157]: 2025-12-03 00:07:51.646 187161 DEBUG nova.network.neutron [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:07:51 compute-1 nova_compute[187157]: 2025-12-03 00:07:51.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:52 compute-1 nova_compute[187157]: 2025-12-03 00:07:52.152 187161 WARNING neutronclient.v2_0.client [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:52 compute-1 nova_compute[187157]: 2025-12-03 00:07:52.606 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:52 compute-1 nova_compute[187157]: 2025-12-03 00:07:52.754 187161 WARNING neutronclient.v2_0.client [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:52 compute-1 nova_compute[187157]: 2025-12-03 00:07:52.899 187161 DEBUG nova.network.neutron [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Updating instance_info_cache with network_info: [{"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:07:53 compute-1 nova_compute[187157]: 2025-12-03 00:07:53.407 187161 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-5187b0f8-a8d1-4c99-a0b9-809caf89b88a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:07:53 compute-1 nova_compute[187157]: 2025-12-03 00:07:53.924 187161 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:53 compute-1 nova_compute[187157]: 2025-12-03 00:07:53.925 187161 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:53 compute-1 nova_compute[187157]: 2025-12-03 00:07:53.925 187161 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:53 compute-1 nova_compute[187157]: 2025-12-03 00:07:53.933 187161 INFO nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 03 00:07:53 compute-1 virtqemud[186882]: Domain id=14 name='instance-00000010' uuid=5187b0f8-a8d1-4c99-a0b9-809caf89b88a is tainted: custom-monitor
Dec 03 00:07:54 compute-1 nova_compute[187157]: 2025-12-03 00:07:54.943 187161 INFO nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 03 00:07:55 compute-1 nova_compute[187157]: 2025-12-03 00:07:55.951 187161 INFO nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 03 00:07:55 compute-1 nova_compute[187157]: 2025-12-03 00:07:55.957 187161 DEBUG nova.compute.manager [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:07:56 compute-1 nova_compute[187157]: 2025-12-03 00:07:56.274 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:56 compute-1 nova_compute[187157]: 2025-12-03 00:07:56.474 187161 DEBUG nova.objects.instance [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 03 00:07:57 compute-1 nova_compute[187157]: 2025-12-03 00:07:57.498 187161 WARNING neutronclient.v2_0.client [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:57 compute-1 nova_compute[187157]: 2025-12-03 00:07:57.588 187161 WARNING neutronclient.v2_0.client [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:57 compute-1 nova_compute[187157]: 2025-12-03 00:07:57.589 187161 WARNING neutronclient.v2_0.client [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:57 compute-1 nova_compute[187157]: 2025-12-03 00:07:57.608 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:01 compute-1 nova_compute[187157]: 2025-12-03 00:08:01.276 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:01.727 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:01.727 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:01.728 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:02 compute-1 nova_compute[187157]: 2025-12-03 00:08:02.610 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:03 compute-1 podman[215116]: 2025-12-03 00:08:03.225000318 +0000 UTC m=+0.056322321 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_id=edpm, 
container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:08:04 compute-1 nova_compute[187157]: 2025-12-03 00:08:04.317 187161 DEBUG oslo_concurrency.lockutils [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:04 compute-1 nova_compute[187157]: 2025-12-03 00:08:04.318 187161 DEBUG oslo_concurrency.lockutils [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:04 compute-1 nova_compute[187157]: 2025-12-03 00:08:04.318 187161 DEBUG oslo_concurrency.lockutils [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:04 compute-1 nova_compute[187157]: 2025-12-03 00:08:04.318 187161 DEBUG oslo_concurrency.lockutils [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:04 compute-1 nova_compute[187157]: 2025-12-03 00:08:04.318 187161 DEBUG oslo_concurrency.lockutils [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:04 compute-1 nova_compute[187157]: 2025-12-03 00:08:04.329 187161 INFO nova.compute.manager [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Terminating instance
Dec 03 00:08:04 compute-1 nova_compute[187157]: 2025-12-03 00:08:04.846 187161 DEBUG nova.compute.manager [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:08:04 compute-1 kernel: tapf32103d7-9e (unregistering): left promiscuous mode
Dec 03 00:08:04 compute-1 NetworkManager[55553]: <info>  [1764720484.8814] device (tapf32103d7-9e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:08:04 compute-1 ovn_controller[95464]: 2025-12-03T00:08:04Z|00164|binding|INFO|Releasing lport f32103d7-9e67-4711-8680-5684cc43d30e from this chassis (sb_readonly=0)
Dec 03 00:08:04 compute-1 nova_compute[187157]: 2025-12-03 00:08:04.886 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:04 compute-1 ovn_controller[95464]: 2025-12-03T00:08:04Z|00165|binding|INFO|Setting lport f32103d7-9e67-4711-8680-5684cc43d30e down in Southbound
Dec 03 00:08:04 compute-1 ovn_controller[95464]: 2025-12-03T00:08:04Z|00166|binding|INFO|Removing iface tapf32103d7-9e ovn-installed in OVS
Dec 03 00:08:04 compute-1 nova_compute[187157]: 2025-12-03 00:08:04.890 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:04 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:04.914 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:63:08 10.100.0.11'], port_security=['fa:16:3e:98:63:08 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '7245dda2-9732-4cc5-acfc-c277bdea6b4f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '869170c9b0864bd8a0f2258e90e55a84', 'neutron:revision_number': '5', 'neutron:security_group_ids': '21025524-a834-4687-a5db-4097a3a2991d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fb2dc55-b9aa-4540-a79d-797e2b8e81ae, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=f32103d7-9e67-4711-8680-5684cc43d30e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:08:04 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:04.917 104348 INFO neutron.agent.ovn.metadata.agent [-] Port f32103d7-9e67-4711-8680-5684cc43d30e in datapath 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 unbound from our chassis
Dec 03 00:08:04 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:04.920 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68
Dec 03 00:08:04 compute-1 nova_compute[187157]: 2025-12-03 00:08:04.927 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:04 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:04.933 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[385f01b3-7585-4616-b39f-aae8a6babb2d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:04 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Deactivated successfully.
Dec 03 00:08:04 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Consumed 14.629s CPU time.
Dec 03 00:08:04 compute-1 systemd-machined[153454]: Machine qemu-13-instance-00000011 terminated.
Dec 03 00:08:04 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:04.982 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[adf5f8b4-e940-4aee-8a22-d66aeaf468b4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:04 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:04.985 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[8609a5fc-c379-44fb-9597-7b1f51090a82]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:05 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:05.021 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[7931b7bb-c2ee-455d-be5c-7fc1157d97fe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:05 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:05.040 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[29cc9aa8-2dac-4e4c-9b3b-b63d50990419]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c6ad8f4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:f8:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445267, 'reachable_time': 33883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215152, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.046 187161 DEBUG nova.compute.manager [req-cec992f2-c3b9-40a9-9034-0192411f9052 req-566dea22-a4a5-4694-a599-91b0f8d25950 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Received event network-vif-unplugged-f32103d7-9e67-4711-8680-5684cc43d30e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.047 187161 DEBUG oslo_concurrency.lockutils [req-cec992f2-c3b9-40a9-9034-0192411f9052 req-566dea22-a4a5-4694-a599-91b0f8d25950 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.047 187161 DEBUG oslo_concurrency.lockutils [req-cec992f2-c3b9-40a9-9034-0192411f9052 req-566dea22-a4a5-4694-a599-91b0f8d25950 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.047 187161 DEBUG oslo_concurrency.lockutils [req-cec992f2-c3b9-40a9-9034-0192411f9052 req-566dea22-a4a5-4694-a599-91b0f8d25950 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.047 187161 DEBUG nova.compute.manager [req-cec992f2-c3b9-40a9-9034-0192411f9052 req-566dea22-a4a5-4694-a599-91b0f8d25950 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] No waiting events found dispatching network-vif-unplugged-f32103d7-9e67-4711-8680-5684cc43d30e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.047 187161 DEBUG nova.compute.manager [req-cec992f2-c3b9-40a9-9034-0192411f9052 req-566dea22-a4a5-4694-a599-91b0f8d25950 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Received event network-vif-unplugged-f32103d7-9e67-4711-8680-5684cc43d30e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:08:05 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:05.054 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[69373479-59b6-4ebf-a7d9-eede0210e9db]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9c6ad8f4-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445275, 'tstamp': 445275}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215153, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9c6ad8f4-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445277, 'tstamp': 445277}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215153, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:05 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:05.055 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c6ad8f4-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.056 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:05 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:05.060 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c6ad8f4-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:05 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:05.061 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.061 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:05 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:05.061 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9c6ad8f4-60, col_values=(('external_ids', {'iface-id': 'df9da247-f3c2-412c-95a4-9a2562c93dd4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:05 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:05.062 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:08:05 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:05.063 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[1b54eb40-58eb-4f53-b8ad-e5614a346198]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.115 187161 INFO nova.virt.libvirt.driver [-] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Instance destroyed successfully.
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.116 187161 DEBUG nova.objects.instance [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lazy-loading 'resources' on Instance uuid 7245dda2-9732-4cc5-acfc-c277bdea6b4f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:08:05 compute-1 podman[215172]: 2025-12-03 00:08:05.477256484 +0000 UTC m=+0.060502351 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251202, config_id=multipathd, io.buildah.version=1.41.4)
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.628 187161 DEBUG nova.virt.libvirt.vif [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-03T00:06:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-528950227',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-528',id=17,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:07:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-rsnmfdif',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:07:14Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=7245dda2-9732-4cc5-acfc-c277bdea6b4f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f32103d7-9e67-4711-8680-5684cc43d30e", "address": "fa:16:3e:98:63:08", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf32103d7-9e", "ovs_interfaceid": "f32103d7-9e67-4711-8680-5684cc43d30e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.629 187161 DEBUG nova.network.os_vif_util [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converting VIF {"id": "f32103d7-9e67-4711-8680-5684cc43d30e", "address": "fa:16:3e:98:63:08", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf32103d7-9e", "ovs_interfaceid": "f32103d7-9e67-4711-8680-5684cc43d30e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.630 187161 DEBUG nova.network.os_vif_util [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:63:08,bridge_name='br-int',has_traffic_filtering=True,id=f32103d7-9e67-4711-8680-5684cc43d30e,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf32103d7-9e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.630 187161 DEBUG os_vif [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:63:08,bridge_name='br-int',has_traffic_filtering=True,id=f32103d7-9e67-4711-8680-5684cc43d30e,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf32103d7-9e') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.633 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.633 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf32103d7-9e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.635 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.636 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.638 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.638 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=12b22bd3-2110-4e58-9d45-4de5bc510cb2) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.639 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.640 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.644 187161 INFO os_vif [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:63:08,bridge_name='br-int',has_traffic_filtering=True,id=f32103d7-9e67-4711-8680-5684cc43d30e,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf32103d7-9e')
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.645 187161 INFO nova.virt.libvirt.driver [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Deleting instance files /var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f_del
Dec 03 00:08:05 compute-1 podman[197537]: time="2025-12-03T00:08:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:08:05 compute-1 nova_compute[187157]: 2025-12-03 00:08:05.646 187161 INFO nova.virt.libvirt.driver [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Deletion of /var/lib/nova/instances/7245dda2-9732-4cc5-acfc-c277bdea6b4f_del complete
Dec 03 00:08:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:08:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:08:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:08:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3071 "" "Go-http-client/1.1"
Dec 03 00:08:06 compute-1 nova_compute[187157]: 2025-12-03 00:08:06.160 187161 INFO nova.compute.manager [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Took 1.31 seconds to destroy the instance on the hypervisor.
Dec 03 00:08:06 compute-1 nova_compute[187157]: 2025-12-03 00:08:06.160 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:08:06 compute-1 nova_compute[187157]: 2025-12-03 00:08:06.160 187161 DEBUG nova.compute.manager [-] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:08:06 compute-1 nova_compute[187157]: 2025-12-03 00:08:06.161 187161 DEBUG nova.network.neutron [-] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:08:06 compute-1 nova_compute[187157]: 2025-12-03 00:08:06.161 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:08:06 compute-1 nova_compute[187157]: 2025-12-03 00:08:06.508 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:08:06 compute-1 nova_compute[187157]: 2025-12-03 00:08:06.888 187161 DEBUG nova.compute.manager [req-5bb740c0-94b5-495b-809a-5a265e3eb53d req-6647d63c-8391-428b-baf5-ac23fe49e209 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Received event network-vif-deleted-f32103d7-9e67-4711-8680-5684cc43d30e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:08:06 compute-1 nova_compute[187157]: 2025-12-03 00:08:06.888 187161 INFO nova.compute.manager [req-5bb740c0-94b5-495b-809a-5a265e3eb53d req-6647d63c-8391-428b-baf5-ac23fe49e209 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Neutron deleted interface f32103d7-9e67-4711-8680-5684cc43d30e; detaching it from the instance and deleting it from the info cache
Dec 03 00:08:06 compute-1 nova_compute[187157]: 2025-12-03 00:08:06.888 187161 DEBUG nova.network.neutron [req-5bb740c0-94b5-495b-809a-5a265e3eb53d req-6647d63c-8391-428b-baf5-ac23fe49e209 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:08:07 compute-1 nova_compute[187157]: 2025-12-03 00:08:07.296 187161 DEBUG nova.compute.manager [req-a3725163-be2d-44be-8541-d8498417e955 req-1d0841bb-cbf7-44a0-968a-faf2709907e2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Received event network-vif-unplugged-f32103d7-9e67-4711-8680-5684cc43d30e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:08:07 compute-1 nova_compute[187157]: 2025-12-03 00:08:07.296 187161 DEBUG oslo_concurrency.lockutils [req-a3725163-be2d-44be-8541-d8498417e955 req-1d0841bb-cbf7-44a0-968a-faf2709907e2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:07 compute-1 nova_compute[187157]: 2025-12-03 00:08:07.296 187161 DEBUG oslo_concurrency.lockutils [req-a3725163-be2d-44be-8541-d8498417e955 req-1d0841bb-cbf7-44a0-968a-faf2709907e2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:07 compute-1 nova_compute[187157]: 2025-12-03 00:08:07.297 187161 DEBUG oslo_concurrency.lockutils [req-a3725163-be2d-44be-8541-d8498417e955 req-1d0841bb-cbf7-44a0-968a-faf2709907e2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:07 compute-1 nova_compute[187157]: 2025-12-03 00:08:07.297 187161 DEBUG nova.compute.manager [req-a3725163-be2d-44be-8541-d8498417e955 req-1d0841bb-cbf7-44a0-968a-faf2709907e2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] No waiting events found dispatching network-vif-unplugged-f32103d7-9e67-4711-8680-5684cc43d30e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:08:07 compute-1 nova_compute[187157]: 2025-12-03 00:08:07.297 187161 DEBUG nova.compute.manager [req-a3725163-be2d-44be-8541-d8498417e955 req-1d0841bb-cbf7-44a0-968a-faf2709907e2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Received event network-vif-unplugged-f32103d7-9e67-4711-8680-5684cc43d30e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:08:07 compute-1 nova_compute[187157]: 2025-12-03 00:08:07.297 187161 DEBUG nova.network.neutron [-] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:08:07 compute-1 nova_compute[187157]: 2025-12-03 00:08:07.395 187161 DEBUG nova.compute.manager [req-5bb740c0-94b5-495b-809a-5a265e3eb53d req-6647d63c-8391-428b-baf5-ac23fe49e209 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Detach interface failed, port_id=f32103d7-9e67-4711-8680-5684cc43d30e, reason: Instance 7245dda2-9732-4cc5-acfc-c277bdea6b4f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:08:07 compute-1 nova_compute[187157]: 2025-12-03 00:08:07.611 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:07 compute-1 nova_compute[187157]: 2025-12-03 00:08:07.805 187161 INFO nova.compute.manager [-] [instance: 7245dda2-9732-4cc5-acfc-c277bdea6b4f] Took 1.64 seconds to deallocate network for instance.
Dec 03 00:08:08 compute-1 nova_compute[187157]: 2025-12-03 00:08:08.323 187161 DEBUG oslo_concurrency.lockutils [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:08 compute-1 nova_compute[187157]: 2025-12-03 00:08:08.323 187161 DEBUG oslo_concurrency.lockutils [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:08 compute-1 nova_compute[187157]: 2025-12-03 00:08:08.378 187161 DEBUG nova.compute.provider_tree [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:08:08 compute-1 nova_compute[187157]: 2025-12-03 00:08:08.893 187161 DEBUG nova.scheduler.client.report [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:08:09 compute-1 nova_compute[187157]: 2025-12-03 00:08:09.406 187161 DEBUG oslo_concurrency.lockutils [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.083s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:09 compute-1 nova_compute[187157]: 2025-12-03 00:08:09.427 187161 INFO nova.scheduler.client.report [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Deleted allocations for instance 7245dda2-9732-4cc5-acfc-c277bdea6b4f
Dec 03 00:08:10 compute-1 nova_compute[187157]: 2025-12-03 00:08:10.471 187161 DEBUG oslo_concurrency.lockutils [None req-98b6b07c-becf-4f76-ae3d-e3fbbb468221 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "7245dda2-9732-4cc5-acfc-c277bdea6b4f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.153s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:10 compute-1 nova_compute[187157]: 2025-12-03 00:08:10.639 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:12 compute-1 nova_compute[187157]: 2025-12-03 00:08:12.260 187161 DEBUG oslo_concurrency.lockutils [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:12 compute-1 nova_compute[187157]: 2025-12-03 00:08:12.260 187161 DEBUG oslo_concurrency.lockutils [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:12 compute-1 nova_compute[187157]: 2025-12-03 00:08:12.261 187161 DEBUG oslo_concurrency.lockutils [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:12 compute-1 nova_compute[187157]: 2025-12-03 00:08:12.262 187161 DEBUG oslo_concurrency.lockutils [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:12 compute-1 nova_compute[187157]: 2025-12-03 00:08:12.262 187161 DEBUG oslo_concurrency.lockutils [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:12 compute-1 nova_compute[187157]: 2025-12-03 00:08:12.275 187161 INFO nova.compute.manager [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Terminating instance
Dec 03 00:08:12 compute-1 nova_compute[187157]: 2025-12-03 00:08:12.613 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:12 compute-1 nova_compute[187157]: 2025-12-03 00:08:12.794 187161 DEBUG nova.compute.manager [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:08:12 compute-1 kernel: tapebecba8e-a0 (unregistering): left promiscuous mode
Dec 03 00:08:12 compute-1 NetworkManager[55553]: <info>  [1764720492.8279] device (tapebecba8e-a0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:08:12 compute-1 nova_compute[187157]: 2025-12-03 00:08:12.835 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:12 compute-1 ovn_controller[95464]: 2025-12-03T00:08:12Z|00167|binding|INFO|Releasing lport ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 from this chassis (sb_readonly=0)
Dec 03 00:08:12 compute-1 ovn_controller[95464]: 2025-12-03T00:08:12Z|00168|binding|INFO|Setting lport ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 down in Southbound
Dec 03 00:08:12 compute-1 ovn_controller[95464]: 2025-12-03T00:08:12Z|00169|binding|INFO|Removing iface tapebecba8e-a0 ovn-installed in OVS
Dec 03 00:08:12 compute-1 nova_compute[187157]: 2025-12-03 00:08:12.837 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:12.845 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:13:00 10.100.0.14'], port_security=['fa:16:3e:7d:13:00 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5187b0f8-a8d1-4c99-a0b9-809caf89b88a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '869170c9b0864bd8a0f2258e90e55a84', 'neutron:revision_number': '15', 'neutron:security_group_ids': '21025524-a834-4687-a5db-4097a3a2991d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fb2dc55-b9aa-4540-a79d-797e2b8e81ae, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:08:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:12.846 104348 INFO neutron.agent.ovn.metadata.agent [-] Port ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 in datapath 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 unbound from our chassis
Dec 03 00:08:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:12.847 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:08:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:12.848 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[1d38477a-520e-43c6-8708-2d50d1aca866]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:12 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:12.849 104348 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 namespace which is not needed anymore
Dec 03 00:08:12 compute-1 nova_compute[187157]: 2025-12-03 00:08:12.856 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:12 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000010.scope: Deactivated successfully.
Dec 03 00:08:12 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000010.scope: Consumed 2.920s CPU time.
Dec 03 00:08:12 compute-1 systemd-machined[153454]: Machine qemu-14-instance-00000010 terminated.
Dec 03 00:08:12 compute-1 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[214828]: [NOTICE]   (214832) : haproxy version is 3.0.5-8e879a5
Dec 03 00:08:12 compute-1 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[214828]: [NOTICE]   (214832) : path to executable is /usr/sbin/haproxy
Dec 03 00:08:12 compute-1 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[214828]: [WARNING]  (214832) : Exiting Master process...
Dec 03 00:08:12 compute-1 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[214828]: [ALERT]    (214832) : Current worker (214834) exited with code 143 (Terminated)
Dec 03 00:08:12 compute-1 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[214828]: [WARNING]  (214832) : All workers exited. Exiting... (0)
Dec 03 00:08:12 compute-1 podman[215216]: 2025-12-03 00:08:12.976148968 +0000 UTC m=+0.035942532 container kill b14f0ecb0740de2325664d82263f80e5f3184caf4d3986b6ccfceb6a6140d612 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Dec 03 00:08:12 compute-1 systemd[1]: libpod-b14f0ecb0740de2325664d82263f80e5f3184caf4d3986b6ccfceb6a6140d612.scope: Deactivated successfully.
Dec 03 00:08:13 compute-1 podman[215231]: 2025-12-03 00:08:13.059254181 +0000 UTC m=+0.066042035 container died b14f0ecb0740de2325664d82263f80e5f3184caf4d3986b6ccfceb6a6140d612 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.059 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.070 187161 DEBUG nova.compute.manager [req-2fa4bac4-998b-456b-bb84-b806f78dc5ba req-aa2c03ac-6efc-4044-ade6-bb4d1c193859 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.070 187161 DEBUG oslo_concurrency.lockutils [req-2fa4bac4-998b-456b-bb84-b806f78dc5ba req-aa2c03ac-6efc-4044-ade6-bb4d1c193859 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.071 187161 DEBUG oslo_concurrency.lockutils [req-2fa4bac4-998b-456b-bb84-b806f78dc5ba req-aa2c03ac-6efc-4044-ade6-bb4d1c193859 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.071 187161 DEBUG oslo_concurrency.lockutils [req-2fa4bac4-998b-456b-bb84-b806f78dc5ba req-aa2c03ac-6efc-4044-ade6-bb4d1c193859 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.071 187161 DEBUG nova.compute.manager [req-2fa4bac4-998b-456b-bb84-b806f78dc5ba req-aa2c03ac-6efc-4044-ade6-bb4d1c193859 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] No waiting events found dispatching network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.071 187161 DEBUG nova.compute.manager [req-2fa4bac4-998b-456b-bb84-b806f78dc5ba req-aa2c03ac-6efc-4044-ade6-bb4d1c193859 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:08:13 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b14f0ecb0740de2325664d82263f80e5f3184caf4d3986b6ccfceb6a6140d612-userdata-shm.mount: Deactivated successfully.
Dec 03 00:08:13 compute-1 systemd[1]: var-lib-containers-storage-overlay-b42e476cb975dea83bc4655999d75c5acf006989e90c7537ebb8bc26effb602a-merged.mount: Deactivated successfully.
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.095 187161 INFO nova.virt.libvirt.driver [-] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Instance destroyed successfully.
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.095 187161 DEBUG nova.objects.instance [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lazy-loading 'resources' on Instance uuid 5187b0f8-a8d1-4c99-a0b9-809caf89b88a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:08:13 compute-1 podman[215231]: 2025-12-03 00:08:13.101816641 +0000 UTC m=+0.108604475 container cleanup b14f0ecb0740de2325664d82263f80e5f3184caf4d3986b6ccfceb6a6140d612 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251202)
Dec 03 00:08:13 compute-1 systemd[1]: libpod-conmon-b14f0ecb0740de2325664d82263f80e5f3184caf4d3986b6ccfceb6a6140d612.scope: Deactivated successfully.
Dec 03 00:08:13 compute-1 podman[215233]: 2025-12-03 00:08:13.118108902 +0000 UTC m=+0.111864303 container remove b14f0ecb0740de2325664d82263f80e5f3184caf4d3986b6ccfceb6a6140d612 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 03 00:08:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:13.122 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[483c675d-eb21-4d2b-912c-f290d7501ee8]: (4, ("Wed Dec  3 12:08:12 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 (b14f0ecb0740de2325664d82263f80e5f3184caf4d3986b6ccfceb6a6140d612)\nb14f0ecb0740de2325664d82263f80e5f3184caf4d3986b6ccfceb6a6140d612\nWed Dec  3 12:08:12 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 (b14f0ecb0740de2325664d82263f80e5f3184caf4d3986b6ccfceb6a6140d612)\nb14f0ecb0740de2325664d82263f80e5f3184caf4d3986b6ccfceb6a6140d612\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:13.123 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[fc702b0b-7a6f-4419-87b3-7dbb6611c61a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:13.124 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:08:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:13.124 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ba9f258e-519b-47f5-8c62-8e511bfecd70]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:13.125 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c6ad8f4-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.126 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:13 compute-1 kernel: tap9c6ad8f4-60: left promiscuous mode
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.141 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:13.144 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[f33f9389-9b13-4f9b-bc53-46961e579487]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:13.162 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a3fb1c68-077e-40c0-a641-246ad495b89d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:13.164 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e223ad01-f1f1-440d-9efe-936468300506]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:13.182 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d652743e-c3cc-41cf-a86b-9b8bfe860141]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445261, 'reachable_time': 18054, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215279, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:13.185 104464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:08:13 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:13.185 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[06131ee9-44a9-4cdb-af11-79a175ad12c5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:13 compute-1 systemd[1]: run-netns-ovnmeta\x2d9c6ad8f4\x2d62a9\x2d4a0d\x2dac57\x2de980ee855c68.mount: Deactivated successfully.
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.605 187161 DEBUG nova.virt.libvirt.vif [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-03T00:06:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1298851656',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-129',id=16,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:06:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-0lswcfs9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',clean_attempts='1',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:07:57Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=5187b0f8-a8d1-4c99-a0b9-809caf89b88a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.606 187161 DEBUG nova.network.os_vif_util [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converting VIF {"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.607 187161 DEBUG nova.network.os_vif_util [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7d:13:00,bridge_name='br-int',has_traffic_filtering=True,id=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebecba8e-a0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.608 187161 DEBUG os_vif [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:13:00,bridge_name='br-int',has_traffic_filtering=True,id=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebecba8e-a0') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.610 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.611 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapebecba8e-a0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.613 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.614 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.615 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.616 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=acb69e85-2c92-41d0-9786-1cdacd991b24) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.617 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.619 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.622 187161 INFO os_vif [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:13:00,bridge_name='br-int',has_traffic_filtering=True,id=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebecba8e-a0')
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.623 187161 INFO nova.virt.libvirt.driver [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Deleting instance files /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a_del
Dec 03 00:08:13 compute-1 nova_compute[187157]: 2025-12-03 00:08:13.623 187161 INFO nova.virt.libvirt.driver [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Deletion of /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a_del complete
Dec 03 00:08:14 compute-1 nova_compute[187157]: 2025-12-03 00:08:14.136 187161 INFO nova.compute.manager [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Took 1.34 seconds to destroy the instance on the hypervisor.
Dec 03 00:08:14 compute-1 nova_compute[187157]: 2025-12-03 00:08:14.137 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:08:14 compute-1 nova_compute[187157]: 2025-12-03 00:08:14.137 187161 DEBUG nova.compute.manager [-] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:08:14 compute-1 nova_compute[187157]: 2025-12-03 00:08:14.137 187161 DEBUG nova.network.neutron [-] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:08:14 compute-1 nova_compute[187157]: 2025-12-03 00:08:14.138 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:08:14 compute-1 sshd-session[215280]: Invalid user sol from 193.32.162.146 port 54258
Dec 03 00:08:14 compute-1 podman[215282]: 2025-12-03 00:08:14.255846389 +0000 UTC m=+0.101839242 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:08:14 compute-1 nova_compute[187157]: 2025-12-03 00:08:14.281 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:08:14 compute-1 sshd-session[215280]: Connection closed by invalid user sol 193.32.162.146 port 54258 [preauth]
Dec 03 00:08:15 compute-1 nova_compute[187157]: 2025-12-03 00:08:15.171 187161 DEBUG nova.network.neutron [-] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:08:15 compute-1 nova_compute[187157]: 2025-12-03 00:08:15.269 187161 DEBUG nova.compute.manager [req-e6f06a56-2367-4c81-b9ff-27af26a1dc79 req-fcf578cc-00bf-4512-a129-8e306e4a3bae 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:08:15 compute-1 nova_compute[187157]: 2025-12-03 00:08:15.269 187161 DEBUG oslo_concurrency.lockutils [req-e6f06a56-2367-4c81-b9ff-27af26a1dc79 req-fcf578cc-00bf-4512-a129-8e306e4a3bae 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:15 compute-1 nova_compute[187157]: 2025-12-03 00:08:15.269 187161 DEBUG oslo_concurrency.lockutils [req-e6f06a56-2367-4c81-b9ff-27af26a1dc79 req-fcf578cc-00bf-4512-a129-8e306e4a3bae 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:15 compute-1 nova_compute[187157]: 2025-12-03 00:08:15.269 187161 DEBUG oslo_concurrency.lockutils [req-e6f06a56-2367-4c81-b9ff-27af26a1dc79 req-fcf578cc-00bf-4512-a129-8e306e4a3bae 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:15 compute-1 nova_compute[187157]: 2025-12-03 00:08:15.270 187161 DEBUG nova.compute.manager [req-e6f06a56-2367-4c81-b9ff-27af26a1dc79 req-fcf578cc-00bf-4512-a129-8e306e4a3bae 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] No waiting events found dispatching network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:08:15 compute-1 nova_compute[187157]: 2025-12-03 00:08:15.270 187161 DEBUG nova.compute.manager [req-e6f06a56-2367-4c81-b9ff-27af26a1dc79 req-fcf578cc-00bf-4512-a129-8e306e4a3bae 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:08:15 compute-1 nova_compute[187157]: 2025-12-03 00:08:15.270 187161 DEBUG nova.compute.manager [req-e6f06a56-2367-4c81-b9ff-27af26a1dc79 req-fcf578cc-00bf-4512-a129-8e306e4a3bae 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-deleted-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:08:15 compute-1 nova_compute[187157]: 2025-12-03 00:08:15.270 187161 INFO nova.compute.manager [req-e6f06a56-2367-4c81-b9ff-27af26a1dc79 req-fcf578cc-00bf-4512-a129-8e306e4a3bae 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Neutron deleted interface ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0; detaching it from the instance and deleting it from the info cache
Dec 03 00:08:15 compute-1 nova_compute[187157]: 2025-12-03 00:08:15.270 187161 DEBUG nova.network.neutron [req-e6f06a56-2367-4c81-b9ff-27af26a1dc79 req-fcf578cc-00bf-4512-a129-8e306e4a3bae 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:08:15 compute-1 nova_compute[187157]: 2025-12-03 00:08:15.772 187161 INFO nova.compute.manager [-] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Took 1.63 seconds to deallocate network for instance.
Dec 03 00:08:15 compute-1 nova_compute[187157]: 2025-12-03 00:08:15.780 187161 DEBUG nova.compute.manager [req-e6f06a56-2367-4c81-b9ff-27af26a1dc79 req-fcf578cc-00bf-4512-a129-8e306e4a3bae 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Detach interface failed, port_id=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0, reason: Instance 5187b0f8-a8d1-4c99-a0b9-809caf89b88a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:08:16 compute-1 nova_compute[187157]: 2025-12-03 00:08:16.291 187161 DEBUG oslo_concurrency.lockutils [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:16 compute-1 nova_compute[187157]: 2025-12-03 00:08:16.291 187161 DEBUG oslo_concurrency.lockutils [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:16 compute-1 nova_compute[187157]: 2025-12-03 00:08:16.296 187161 DEBUG oslo_concurrency.lockutils [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:16 compute-1 nova_compute[187157]: 2025-12-03 00:08:16.332 187161 INFO nova.scheduler.client.report [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Deleted allocations for instance 5187b0f8-a8d1-4c99-a0b9-809caf89b88a
Dec 03 00:08:17 compute-1 nova_compute[187157]: 2025-12-03 00:08:17.360 187161 DEBUG oslo_concurrency.lockutils [None req-bc26980b-4253-47c9-a2f8-049f5ddb9fb7 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.099s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:17 compute-1 nova_compute[187157]: 2025-12-03 00:08:17.616 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:18 compute-1 nova_compute[187157]: 2025-12-03 00:08:18.618 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:19 compute-1 openstack_network_exporter[199685]: ERROR   00:08:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:08:19 compute-1 openstack_network_exporter[199685]: ERROR   00:08:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:08:19 compute-1 openstack_network_exporter[199685]: ERROR   00:08:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:08:19 compute-1 openstack_network_exporter[199685]: ERROR   00:08:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:08:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:08:19 compute-1 openstack_network_exporter[199685]: ERROR   00:08:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:08:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:08:20 compute-1 podman[215307]: 2025-12-03 00:08:20.245252933 +0000 UTC m=+0.091918254 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 03 00:08:22 compute-1 podman[215333]: 2025-12-03 00:08:22.207263571 +0000 UTC m=+0.048428361 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Dec 03 00:08:22 compute-1 nova_compute[187157]: 2025-12-03 00:08:22.617 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:23 compute-1 nova_compute[187157]: 2025-12-03 00:08:23.619 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:27 compute-1 nova_compute[187157]: 2025-12-03 00:08:27.619 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:28 compute-1 nova_compute[187157]: 2025-12-03 00:08:28.621 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:32 compute-1 nova_compute[187157]: 2025-12-03 00:08:32.621 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:33 compute-1 nova_compute[187157]: 2025-12-03 00:08:33.623 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:33 compute-1 nova_compute[187157]: 2025-12-03 00:08:33.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:34 compute-1 podman[215353]: 2025-12-03 00:08:34.215482155 +0000 UTC m=+0.054915267 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Dec 03 00:08:35 compute-1 podman[197537]: time="2025-12-03T00:08:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:08:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:08:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:08:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:08:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2613 "" "Go-http-client/1.1"
Dec 03 00:08:36 compute-1 podman[215376]: 2025-12-03 00:08:36.212907692 +0000 UTC m=+0.057197002 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 03 00:08:37 compute-1 nova_compute[187157]: 2025-12-03 00:08:37.624 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:38 compute-1 nova_compute[187157]: 2025-12-03 00:08:38.700 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:40 compute-1 nova_compute[187157]: 2025-12-03 00:08:40.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:41 compute-1 nova_compute[187157]: 2025-12-03 00:08:41.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:42 compute-1 nova_compute[187157]: 2025-12-03 00:08:42.416 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:42 compute-1 nova_compute[187157]: 2025-12-03 00:08:42.417 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:42 compute-1 nova_compute[187157]: 2025-12-03 00:08:42.627 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:42 compute-1 nova_compute[187157]: 2025-12-03 00:08:42.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:42 compute-1 nova_compute[187157]: 2025-12-03 00:08:42.922 187161 DEBUG nova.compute.manager [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:08:43 compute-1 nova_compute[187157]: 2025-12-03 00:08:43.212 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:43 compute-1 nova_compute[187157]: 2025-12-03 00:08:43.212 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:43 compute-1 nova_compute[187157]: 2025-12-03 00:08:43.213 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:43 compute-1 nova_compute[187157]: 2025-12-03 00:08:43.213 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:08:43 compute-1 nova_compute[187157]: 2025-12-03 00:08:43.502 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:43 compute-1 nova_compute[187157]: 2025-12-03 00:08:43.502 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:43 compute-1 nova_compute[187157]: 2025-12-03 00:08:43.508 187161 DEBUG nova.virt.hardware [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:08:43 compute-1 nova_compute[187157]: 2025-12-03 00:08:43.508 187161 INFO nova.compute.claims [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Claim successful on node compute-1.ctlplane.example.com
Dec 03 00:08:43 compute-1 nova_compute[187157]: 2025-12-03 00:08:43.552 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:08:43 compute-1 nova_compute[187157]: 2025-12-03 00:08:43.553 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:08:43 compute-1 nova_compute[187157]: 2025-12-03 00:08:43.574 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:08:43 compute-1 nova_compute[187157]: 2025-12-03 00:08:43.575 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5821MB free_disk=73.16621017456055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:08:43 compute-1 nova_compute[187157]: 2025-12-03 00:08:43.575 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:43 compute-1 nova_compute[187157]: 2025-12-03 00:08:43.702 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:44 compute-1 nova_compute[187157]: 2025-12-03 00:08:44.662 187161 DEBUG nova.scheduler.client.report [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Refreshing inventories for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 03 00:08:44 compute-1 nova_compute[187157]: 2025-12-03 00:08:44.710 187161 DEBUG nova.scheduler.client.report [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Updating ProviderTree inventory for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 03 00:08:44 compute-1 nova_compute[187157]: 2025-12-03 00:08:44.711 187161 DEBUG nova.compute.provider_tree [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Updating inventory in ProviderTree for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 03 00:08:44 compute-1 nova_compute[187157]: 2025-12-03 00:08:44.721 187161 DEBUG nova.scheduler.client.report [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Refreshing aggregate associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 03 00:08:44 compute-1 nova_compute[187157]: 2025-12-03 00:08:44.740 187161 DEBUG nova.scheduler.client.report [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Refreshing trait associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ARCH_X86_64,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 03 00:08:44 compute-1 nova_compute[187157]: 2025-12-03 00:08:44.777 187161 DEBUG nova.compute.provider_tree [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:08:45 compute-1 podman[215398]: 2025-12-03 00:08:45.200499456 +0000 UTC m=+0.046701650 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:08:45 compute-1 nova_compute[187157]: 2025-12-03 00:08:45.472 187161 DEBUG nova.scheduler.client.report [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:08:46 compute-1 nova_compute[187157]: 2025-12-03 00:08:46.051 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.549s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:46 compute-1 nova_compute[187157]: 2025-12-03 00:08:46.052 187161 DEBUG nova.compute.manager [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:08:46 compute-1 nova_compute[187157]: 2025-12-03 00:08:46.054 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 2.478s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:47 compute-1 nova_compute[187157]: 2025-12-03 00:08:47.157 187161 DEBUG nova.compute.manager [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:08:47 compute-1 nova_compute[187157]: 2025-12-03 00:08:47.158 187161 DEBUG nova.network.neutron [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:08:47 compute-1 nova_compute[187157]: 2025-12-03 00:08:47.158 187161 WARNING neutronclient.v2_0.client [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:08:47 compute-1 nova_compute[187157]: 2025-12-03 00:08:47.158 187161 WARNING neutronclient.v2_0.client [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:08:47 compute-1 nova_compute[187157]: 2025-12-03 00:08:47.314 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:08:47 compute-1 nova_compute[187157]: 2025-12-03 00:08:47.314 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:08:47 compute-1 nova_compute[187157]: 2025-12-03 00:08:47.314 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:08:43 up  1:15,  0 user,  load average: 0.56, 0.47, 0.41\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_networking': '1', 'num_os_type_None': '1', 'num_proj_869170c9b0864bd8a0f2258e90e55a84': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:08:47 compute-1 nova_compute[187157]: 2025-12-03 00:08:47.358 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:08:47 compute-1 nova_compute[187157]: 2025-12-03 00:08:47.628 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:47 compute-1 nova_compute[187157]: 2025-12-03 00:08:47.665 187161 INFO nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:08:47 compute-1 nova_compute[187157]: 2025-12-03 00:08:47.928 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:08:48 compute-1 nova_compute[187157]: 2025-12-03 00:08:48.245 187161 DEBUG nova.compute.manager [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:08:48 compute-1 nova_compute[187157]: 2025-12-03 00:08:48.311 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:48.311 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:08:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:48.315 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:08:48 compute-1 nova_compute[187157]: 2025-12-03 00:08:48.471 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:08:48 compute-1 nova_compute[187157]: 2025-12-03 00:08:48.472 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.418s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:48 compute-1 nova_compute[187157]: 2025-12-03 00:08:48.704 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:48 compute-1 nova_compute[187157]: 2025-12-03 00:08:48.896 187161 DEBUG nova.network.neutron [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Successfully created port: 8cea46ae-d25b-44ab-826a-ac08e1df95d3 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.368 187161 DEBUG nova.compute.manager [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.370 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.370 187161 INFO nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Creating image(s)
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.370 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "/var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.371 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "/var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.371 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "/var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.372 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.375 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.376 187161 DEBUG oslo_concurrency.processutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:08:49 compute-1 openstack_network_exporter[199685]: ERROR   00:08:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:08:49 compute-1 openstack_network_exporter[199685]: ERROR   00:08:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:08:49 compute-1 openstack_network_exporter[199685]: ERROR   00:08:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:08:49 compute-1 openstack_network_exporter[199685]: ERROR   00:08:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:08:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:08:49 compute-1 openstack_network_exporter[199685]: ERROR   00:08:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:08:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.440 187161 DEBUG oslo_concurrency.processutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.441 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.442 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.442 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.446 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.446 187161 DEBUG oslo_concurrency.processutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.497 187161 DEBUG oslo_concurrency.processutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.498 187161 DEBUG oslo_concurrency.processutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.654 187161 DEBUG oslo_concurrency.processutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk 1073741824" returned: 0 in 0.157s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.655 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.214s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.656 187161 DEBUG oslo_concurrency.processutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.715 187161 DEBUG oslo_concurrency.processutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.715 187161 DEBUG nova.virt.disk.api [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Checking if we can resize image /var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.716 187161 DEBUG oslo_concurrency.processutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.774 187161 DEBUG oslo_concurrency.processutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.774 187161 DEBUG nova.virt.disk.api [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Cannot resize image /var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.775 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.775 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Ensure instance console log exists: /var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.776 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.776 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:49 compute-1 nova_compute[187157]: 2025-12-03 00:08:49.776 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:50 compute-1 nova_compute[187157]: 2025-12-03 00:08:50.796 187161 DEBUG nova.network.neutron [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Successfully updated port: 8cea46ae-d25b-44ab-826a-ac08e1df95d3 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:08:50 compute-1 nova_compute[187157]: 2025-12-03 00:08:50.916 187161 DEBUG nova.compute.manager [req-73bdb378-107d-4419-93c6-f5c0695d4b4b req-b490cfae-797c-4b11-8500-1e765576a126 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Received event network-changed-8cea46ae-d25b-44ab-826a-ac08e1df95d3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:08:50 compute-1 nova_compute[187157]: 2025-12-03 00:08:50.917 187161 DEBUG nova.compute.manager [req-73bdb378-107d-4419-93c6-f5c0695d4b4b req-b490cfae-797c-4b11-8500-1e765576a126 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Refreshing instance network info cache due to event network-changed-8cea46ae-d25b-44ab-826a-ac08e1df95d3. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:08:50 compute-1 nova_compute[187157]: 2025-12-03 00:08:50.917 187161 DEBUG oslo_concurrency.lockutils [req-73bdb378-107d-4419-93c6-f5c0695d4b4b req-b490cfae-797c-4b11-8500-1e765576a126 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-1e3e5721-9cda-4368-b5c4-c6a8d4d8db95" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:08:50 compute-1 nova_compute[187157]: 2025-12-03 00:08:50.917 187161 DEBUG oslo_concurrency.lockutils [req-73bdb378-107d-4419-93c6-f5c0695d4b4b req-b490cfae-797c-4b11-8500-1e765576a126 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-1e3e5721-9cda-4368-b5c4-c6a8d4d8db95" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:08:50 compute-1 nova_compute[187157]: 2025-12-03 00:08:50.917 187161 DEBUG nova.network.neutron [req-73bdb378-107d-4419-93c6-f5c0695d4b4b req-b490cfae-797c-4b11-8500-1e765576a126 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Refreshing network info cache for port 8cea46ae-d25b-44ab-826a-ac08e1df95d3 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:08:51 compute-1 podman[215438]: 2025-12-03 00:08:51.27123066 +0000 UTC m=+0.095991122 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 03 00:08:51 compute-1 nova_compute[187157]: 2025-12-03 00:08:51.427 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "refresh_cache-1e3e5721-9cda-4368-b5c4-c6a8d4d8db95" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:08:51 compute-1 nova_compute[187157]: 2025-12-03 00:08:51.431 187161 WARNING neutronclient.v2_0.client [req-73bdb378-107d-4419-93c6-f5c0695d4b4b req-b490cfae-797c-4b11-8500-1e765576a126 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:08:51 compute-1 nova_compute[187157]: 2025-12-03 00:08:51.468 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:51 compute-1 nova_compute[187157]: 2025-12-03 00:08:51.468 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:51 compute-1 nova_compute[187157]: 2025-12-03 00:08:51.469 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:08:51 compute-1 nova_compute[187157]: 2025-12-03 00:08:51.527 187161 DEBUG nova.network.neutron [req-73bdb378-107d-4419-93c6-f5c0695d4b4b req-b490cfae-797c-4b11-8500-1e765576a126 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:08:51 compute-1 nova_compute[187157]: 2025-12-03 00:08:51.731 187161 DEBUG nova.network.neutron [req-73bdb378-107d-4419-93c6-f5c0695d4b4b req-b490cfae-797c-4b11-8500-1e765576a126 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:08:52 compute-1 nova_compute[187157]: 2025-12-03 00:08:52.466 187161 DEBUG oslo_concurrency.lockutils [req-73bdb378-107d-4419-93c6-f5c0695d4b4b req-b490cfae-797c-4b11-8500-1e765576a126 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-1e3e5721-9cda-4368-b5c4-c6a8d4d8db95" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:08:52 compute-1 nova_compute[187157]: 2025-12-03 00:08:52.467 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquired lock "refresh_cache-1e3e5721-9cda-4368-b5c4-c6a8d4d8db95" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:08:52 compute-1 nova_compute[187157]: 2025-12-03 00:08:52.467 187161 DEBUG nova.network.neutron [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:08:52 compute-1 nova_compute[187157]: 2025-12-03 00:08:52.630 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:52 compute-1 nova_compute[187157]: 2025-12-03 00:08:52.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:52 compute-1 nova_compute[187157]: 2025-12-03 00:08:52.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:53 compute-1 podman[215465]: 2025-12-03 00:08:53.206429695 +0000 UTC m=+0.051600378 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 03 00:08:53 compute-1 nova_compute[187157]: 2025-12-03 00:08:53.533 187161 DEBUG nova.network.neutron [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:08:53 compute-1 nova_compute[187157]: 2025-12-03 00:08:53.706 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:53 compute-1 nova_compute[187157]: 2025-12-03 00:08:53.801 187161 WARNING neutronclient.v2_0.client [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:08:53 compute-1 nova_compute[187157]: 2025-12-03 00:08:53.951 187161 DEBUG nova.network.neutron [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Updating instance_info_cache with network_info: [{"id": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "address": "fa:16:3e:32:d5:ce", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cea46ae-d2", "ovs_interfaceid": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.466 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Releasing lock "refresh_cache-1e3e5721-9cda-4368-b5c4-c6a8d4d8db95" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.466 187161 DEBUG nova.compute.manager [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Instance network_info: |[{"id": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "address": "fa:16:3e:32:d5:ce", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cea46ae-d2", "ovs_interfaceid": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.469 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Start _get_guest_xml network_info=[{"id": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "address": "fa:16:3e:32:d5:ce", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cea46ae-d2", "ovs_interfaceid": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.472 187161 WARNING nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.474 187161 DEBUG nova.virt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-692236767', uuid='1e3e5721-9cda-4368-b5c4-c6a8d4d8db95'), owner=OwnerMeta(userid='d7f72082c96e4f868d5b158a57237cee', username='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin', projectid='869170c9b0864bd8a0f2258e90e55a84', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "address": "fa:16:3e:32:d5:ce", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap8cea46ae-d2", "ovs_interfaceid": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764720534.4740944) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.478 187161 DEBUG nova.virt.libvirt.host [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.479 187161 DEBUG nova.virt.libvirt.host [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.482 187161 DEBUG nova.virt.libvirt.host [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.483 187161 DEBUG nova.virt.libvirt.host [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.484 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.484 187161 DEBUG nova.virt.hardware [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.485 187161 DEBUG nova.virt.hardware [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.485 187161 DEBUG nova.virt.hardware [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.486 187161 DEBUG nova.virt.hardware [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.486 187161 DEBUG nova.virt.hardware [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.486 187161 DEBUG nova.virt.hardware [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.486 187161 DEBUG nova.virt.hardware [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.487 187161 DEBUG nova.virt.hardware [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.487 187161 DEBUG nova.virt.hardware [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.487 187161 DEBUG nova.virt.hardware [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.487 187161 DEBUG nova.virt.hardware [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.491 187161 DEBUG nova.virt.libvirt.vif [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:08:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-692236767',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-692',id=19,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-f791gp5t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:08:48Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=1e3e5721-9cda-4368-b5c4-c6a8d4d8db95,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "address": "fa:16:3e:32:d5:ce", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cea46ae-d2", "ovs_interfaceid": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.492 187161 DEBUG nova.network.os_vif_util [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converting VIF {"id": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "address": "fa:16:3e:32:d5:ce", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cea46ae-d2", "ovs_interfaceid": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.493 187161 DEBUG nova.network.os_vif_util [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:d5:ce,bridge_name='br-int',has_traffic_filtering=True,id=8cea46ae-d25b-44ab-826a-ac08e1df95d3,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cea46ae-d2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.494 187161 DEBUG nova.objects.instance [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:08:54 compute-1 nova_compute[187157]: 2025-12-03 00:08:54.695 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.005 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:08:55 compute-1 nova_compute[187157]:   <uuid>1e3e5721-9cda-4368-b5c4-c6a8d4d8db95</uuid>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   <name>instance-00000013</name>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   <memory>131072</memory>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   <vcpu>1</vcpu>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   <metadata>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-692236767</nova:name>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-03 00:08:54</nova:creationTime>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:08:55 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 03 00:08:55 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 03 00:08:55 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 03 00:08:55 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:08:55 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:08:55 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 03 00:08:55 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:08:55 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:08:55 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:08:55 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:08:55 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:08:55 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 03 00:08:55 compute-1 nova_compute[187157]:         <nova:properties>
Dec 03 00:08:55 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:08:55 compute-1 nova_compute[187157]:         </nova:properties>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       </nova:image>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <nova:owner>
Dec 03 00:08:55 compute-1 nova_compute[187157]:         <nova:user uuid="d7f72082c96e4f868d5b158a57237cee">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin</nova:user>
Dec 03 00:08:55 compute-1 nova_compute[187157]:         <nova:project uuid="869170c9b0864bd8a0f2258e90e55a84">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579</nova:project>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       </nova:owner>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <nova:ports>
Dec 03 00:08:55 compute-1 nova_compute[187157]:         <nova:port uuid="8cea46ae-d25b-44ab-826a-ac08e1df95d3">
Dec 03 00:08:55 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:         </nova:port>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       </nova:ports>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     </nova:instance>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   </metadata>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <system>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <entry name="serial">1e3e5721-9cda-4368-b5c4-c6a8d4d8db95</entry>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <entry name="uuid">1e3e5721-9cda-4368-b5c4-c6a8d4d8db95</entry>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     </system>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   </sysinfo>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   <os>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   </os>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   <features>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <acpi/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <apic/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <vmcoreinfo/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   </features>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   </clock>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact">
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <model>Nehalem</model>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   </cpu>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   <devices>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk.config"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:32:d5:ce"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <target dev="tap8cea46ae-d2"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     </interface>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/console.log" append="off"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     </serial>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <video>
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     </video>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     </rng>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <controller type="usb" index="0"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:08:55 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 03 00:08:55 compute-1 nova_compute[187157]:     </memballoon>
Dec 03 00:08:55 compute-1 nova_compute[187157]:   </devices>
Dec 03 00:08:55 compute-1 nova_compute[187157]: </domain>
Dec 03 00:08:55 compute-1 nova_compute[187157]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.007 187161 DEBUG nova.compute.manager [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Preparing to wait for external event network-vif-plugged-8cea46ae-d25b-44ab-826a-ac08e1df95d3 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.007 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.007 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.008 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.008 187161 DEBUG nova.virt.libvirt.vif [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:08:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-692236767',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-692',id=19,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-f791gp5t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:08:48Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=1e3e5721-9cda-4368-b5c4-c6a8d4d8db95,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "address": "fa:16:3e:32:d5:ce", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cea46ae-d2", "ovs_interfaceid": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.008 187161 DEBUG nova.network.os_vif_util [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converting VIF {"id": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "address": "fa:16:3e:32:d5:ce", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cea46ae-d2", "ovs_interfaceid": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.009 187161 DEBUG nova.network.os_vif_util [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:d5:ce,bridge_name='br-int',has_traffic_filtering=True,id=8cea46ae-d25b-44ab-826a-ac08e1df95d3,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cea46ae-d2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.009 187161 DEBUG os_vif [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:d5:ce,bridge_name='br-int',has_traffic_filtering=True,id=8cea46ae-d25b-44ab-826a-ac08e1df95d3,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cea46ae-d2') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.010 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.010 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.010 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.011 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.011 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '68ec6e1d-239c-513e-a799-6c5b47f446da', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.012 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.014 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.015 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.015 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8cea46ae-d2, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.016 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap8cea46ae-d2, col_values=(('qos', UUID('51ef5859-aea8-4949-8b89-7cb128f338cc')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.016 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap8cea46ae-d2, col_values=(('external_ids', {'iface-id': '8cea46ae-d25b-44ab-826a-ac08e1df95d3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:32:d5:ce', 'vm-uuid': '1e3e5721-9cda-4368-b5c4-c6a8d4d8db95'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.017 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:55 compute-1 NetworkManager[55553]: <info>  [1764720535.0179] manager: (tap8cea46ae-d2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.019 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.024 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:55 compute-1 nova_compute[187157]: 2025-12-03 00:08:55.024 187161 INFO os_vif [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:d5:ce,bridge_name='br-int',has_traffic_filtering=True,id=8cea46ae-d25b-44ab-826a-ac08e1df95d3,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cea46ae-d2')
Dec 03 00:08:56 compute-1 nova_compute[187157]: 2025-12-03 00:08:56.562 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:08:56 compute-1 nova_compute[187157]: 2025-12-03 00:08:56.563 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:08:56 compute-1 nova_compute[187157]: 2025-12-03 00:08:56.563 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] No VIF found with MAC fa:16:3e:32:d5:ce, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:08:56 compute-1 nova_compute[187157]: 2025-12-03 00:08:56.564 187161 INFO nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Using config drive
Dec 03 00:08:57 compute-1 nova_compute[187157]: 2025-12-03 00:08:57.073 187161 WARNING neutronclient.v2_0.client [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:08:57 compute-1 nova_compute[187157]: 2025-12-03 00:08:57.607 187161 INFO nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Creating config drive at /var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk.config
Dec 03 00:08:57 compute-1 nova_compute[187157]: 2025-12-03 00:08:57.614 187161 DEBUG oslo_concurrency.processutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpkvrhb139 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:08:57 compute-1 nova_compute[187157]: 2025-12-03 00:08:57.633 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:57 compute-1 nova_compute[187157]: 2025-12-03 00:08:57.742 187161 DEBUG oslo_concurrency.processutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpkvrhb139" returned: 0 in 0.128s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:08:57 compute-1 kernel: tap8cea46ae-d2: entered promiscuous mode
Dec 03 00:08:57 compute-1 NetworkManager[55553]: <info>  [1764720537.8041] manager: (tap8cea46ae-d2): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Dec 03 00:08:57 compute-1 nova_compute[187157]: 2025-12-03 00:08:57.805 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:57 compute-1 ovn_controller[95464]: 2025-12-03T00:08:57Z|00170|binding|INFO|Claiming lport 8cea46ae-d25b-44ab-826a-ac08e1df95d3 for this chassis.
Dec 03 00:08:57 compute-1 ovn_controller[95464]: 2025-12-03T00:08:57Z|00171|binding|INFO|8cea46ae-d25b-44ab-826a-ac08e1df95d3: Claiming fa:16:3e:32:d5:ce 10.100.0.9
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.816 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:d5:ce 10.100.0.9'], port_security=['fa:16:3e:32:d5:ce 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1e3e5721-9cda-4368-b5c4-c6a8d4d8db95', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '869170c9b0864bd8a0f2258e90e55a84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '21025524-a834-4687-a5db-4097a3a2991d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fb2dc55-b9aa-4540-a79d-797e2b8e81ae, chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=8cea46ae-d25b-44ab-826a-ac08e1df95d3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.817 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 8cea46ae-d25b-44ab-826a-ac08e1df95d3 in datapath 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 bound to our chassis
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.818 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68
Dec 03 00:08:57 compute-1 nova_compute[187157]: 2025-12-03 00:08:57.824 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:57 compute-1 ovn_controller[95464]: 2025-12-03T00:08:57Z|00172|binding|INFO|Setting lport 8cea46ae-d25b-44ab-826a-ac08e1df95d3 ovn-installed in OVS
Dec 03 00:08:57 compute-1 ovn_controller[95464]: 2025-12-03T00:08:57Z|00173|binding|INFO|Setting lport 8cea46ae-d25b-44ab-826a-ac08e1df95d3 up in Southbound
Dec 03 00:08:57 compute-1 nova_compute[187157]: 2025-12-03 00:08:57.827 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.829 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[cfc2a9e3-d6fa-45f6-9db1-3e4e88679012]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.830 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9c6ad8f4-61 in ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.832 207957 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9c6ad8f4-60 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.832 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ce1a866c-d41a-4225-93de-1e0c38f83d88]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:57 compute-1 systemd-udevd[215505]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.833 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[51f2aae2-90e9-4714-b62f-9f84c4168b40]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:57 compute-1 systemd-machined[153454]: New machine qemu-15-instance-00000013.
Dec 03 00:08:57 compute-1 NetworkManager[55553]: <info>  [1764720537.8454] device (tap8cea46ae-d2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:08:57 compute-1 systemd[1]: Started Virtual Machine qemu-15-instance-00000013.
Dec 03 00:08:57 compute-1 NetworkManager[55553]: <info>  [1764720537.8462] device (tap8cea46ae-d2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.847 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[d885f9cf-0a77-44f0-8256-2458f506bd24]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.864 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[7ce31a51-3a9f-4bc8-9d3d-64f82f8c386f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.893 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[3d418183-5e3a-42f9-82b0-9a8698c53a89]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.900 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[66a55707-4af9-46cf-8a79-e4a234f136c8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:57 compute-1 NetworkManager[55553]: <info>  [1764720537.9011] manager: (tap9c6ad8f4-60): new Veth device (/org/freedesktop/NetworkManager/Devices/63)
Dec 03 00:08:57 compute-1 systemd-udevd[215508]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.930 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[b45b0b11-f60c-4da2-88a7-4adb2ce1bb24]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.933 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[2de9e872-d7fc-40d5-a661-57876d6b6eff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:57 compute-1 NetworkManager[55553]: <info>  [1764720537.9559] device (tap9c6ad8f4-60): carrier: link connected
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.962 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[49e3ea54-7d83-4691-b941-d7fdb224bfd2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.978 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[8fa5b37d-24f9-44d4-9202-f07872929ba4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c6ad8f4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:f8:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455761, 'reachable_time': 38494, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215537, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:57.994 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[bf5cc3c6-1d5b-4274-93d8-51b1b6e08f7a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe04:f806'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 455761, 'tstamp': 455761}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215538, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.006 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e7ddc802-2b04-4da7-a199-25e9798e6e29]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c6ad8f4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:f8:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455761, 'reachable_time': 38494, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215539, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.029 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[49b68535-da16-409b-b8c4-37f15ec86dbb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:58 compute-1 nova_compute[187157]: 2025-12-03 00:08:58.055 187161 DEBUG nova.compute.manager [req-3668c969-289b-4616-9c79-77bac5378ad3 req-83591d87-ee0a-464a-8b98-b76eb55ac28a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Received event network-vif-plugged-8cea46ae-d25b-44ab-826a-ac08e1df95d3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:08:58 compute-1 nova_compute[187157]: 2025-12-03 00:08:58.055 187161 DEBUG oslo_concurrency.lockutils [req-3668c969-289b-4616-9c79-77bac5378ad3 req-83591d87-ee0a-464a-8b98-b76eb55ac28a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:58 compute-1 nova_compute[187157]: 2025-12-03 00:08:58.056 187161 DEBUG oslo_concurrency.lockutils [req-3668c969-289b-4616-9c79-77bac5378ad3 req-83591d87-ee0a-464a-8b98-b76eb55ac28a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:58 compute-1 nova_compute[187157]: 2025-12-03 00:08:58.056 187161 DEBUG oslo_concurrency.lockutils [req-3668c969-289b-4616-9c79-77bac5378ad3 req-83591d87-ee0a-464a-8b98-b76eb55ac28a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:58 compute-1 nova_compute[187157]: 2025-12-03 00:08:58.056 187161 DEBUG nova.compute.manager [req-3668c969-289b-4616-9c79-77bac5378ad3 req-83591d87-ee0a-464a-8b98-b76eb55ac28a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Processing event network-vif-plugged-8cea46ae-d25b-44ab-826a-ac08e1df95d3 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.086 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[cbb92507-0d6b-49ed-bca4-068f4870368a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.088 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c6ad8f4-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.088 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.088 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c6ad8f4-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:58 compute-1 NetworkManager[55553]: <info>  [1764720538.0911] manager: (tap9c6ad8f4-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Dec 03 00:08:58 compute-1 kernel: tap9c6ad8f4-60: entered promiscuous mode
Dec 03 00:08:58 compute-1 nova_compute[187157]: 2025-12-03 00:08:58.091 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.094 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9c6ad8f4-60, col_values=(('external_ids', {'iface-id': 'df9da247-f3c2-412c-95a4-9a2562c93dd4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:58 compute-1 nova_compute[187157]: 2025-12-03 00:08:58.095 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.097 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[4715fa84-56d7-4a28-a54d-457045429bef]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.098 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.098 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.098 104348 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.098 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.099 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[0cf10a22-30f8-4212-8a9b-b1ca5ade2680]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.099 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.100 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ea5d05e8-8b09-4976-bdba-8eb630a3cf76]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:58 compute-1 ovn_controller[95464]: 2025-12-03T00:08:58Z|00174|binding|INFO|Releasing lport df9da247-f3c2-412c-95a4-9a2562c93dd4 from this chassis (sb_readonly=0)
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.100 104348 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: global
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     log         /dev/log local0 debug
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     log-tag     haproxy-metadata-proxy-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     user        root
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     group       root
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     maxconn     1024
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     pidfile     /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     daemon
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: defaults
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     log global
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     mode http
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     option httplog
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     option dontlognull
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     option http-server-close
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     option forwardfor
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     retries                 3
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     timeout http-request    30s
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     timeout connect         30s
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     timeout client          32s
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     timeout server          32s
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     timeout http-keep-alive 30s
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: listen listener
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     bind 169.254.169.254:80
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:     http-request add-header X-OVN-Network-ID 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.101 104348 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'env', 'PROCESS_TAG=haproxy-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:08:58 compute-1 nova_compute[187157]: 2025-12-03 00:08:58.126 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:08:58.316 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:58 compute-1 podman[215576]: 2025-12-03 00:08:58.492152828 +0000 UTC m=+0.055375828 container create 19bc0f10c37b0e410d44a560ca9de7ec537a2afef6c3465e3c9db7aa684f486b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:08:58 compute-1 nova_compute[187157]: 2025-12-03 00:08:58.514 187161 DEBUG nova.compute.manager [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:08:58 compute-1 nova_compute[187157]: 2025-12-03 00:08:58.518 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:08:58 compute-1 nova_compute[187157]: 2025-12-03 00:08:58.522 187161 INFO nova.virt.libvirt.driver [-] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Instance spawned successfully.
Dec 03 00:08:58 compute-1 nova_compute[187157]: 2025-12-03 00:08:58.522 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:08:58 compute-1 systemd[1]: Started libpod-conmon-19bc0f10c37b0e410d44a560ca9de7ec537a2afef6c3465e3c9db7aa684f486b.scope.
Dec 03 00:08:58 compute-1 systemd[1]: Started libcrun container.
Dec 03 00:08:58 compute-1 podman[215576]: 2025-12-03 00:08:58.460044749 +0000 UTC m=+0.023267769 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:08:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3c254271f49ce3558a181d32817e60b137cd450a9162d6cc9deff9c20c053ec/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:08:58 compute-1 podman[215576]: 2025-12-03 00:08:58.567216568 +0000 UTC m=+0.130439588 container init 19bc0f10c37b0e410d44a560ca9de7ec537a2afef6c3465e3c9db7aa684f486b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:08:58 compute-1 podman[215576]: 2025-12-03 00:08:58.573477699 +0000 UTC m=+0.136700699 container start 19bc0f10c37b0e410d44a560ca9de7ec537a2afef6c3465e3c9db7aa684f486b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 03 00:08:58 compute-1 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[215593]: [NOTICE]   (215597) : New worker (215599) forked
Dec 03 00:08:58 compute-1 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[215593]: [NOTICE]   (215597) : Loading success.
Dec 03 00:08:59 compute-1 nova_compute[187157]: 2025-12-03 00:08:59.034 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:08:59 compute-1 nova_compute[187157]: 2025-12-03 00:08:59.035 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:08:59 compute-1 nova_compute[187157]: 2025-12-03 00:08:59.035 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:08:59 compute-1 nova_compute[187157]: 2025-12-03 00:08:59.035 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:08:59 compute-1 nova_compute[187157]: 2025-12-03 00:08:59.036 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:08:59 compute-1 nova_compute[187157]: 2025-12-03 00:08:59.036 187161 DEBUG nova.virt.libvirt.driver [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:08:59 compute-1 nova_compute[187157]: 2025-12-03 00:08:59.546 187161 INFO nova.compute.manager [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Took 10.18 seconds to spawn the instance on the hypervisor.
Dec 03 00:08:59 compute-1 nova_compute[187157]: 2025-12-03 00:08:59.546 187161 DEBUG nova.compute.manager [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:09:00 compute-1 nova_compute[187157]: 2025-12-03 00:09:00.019 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:00 compute-1 nova_compute[187157]: 2025-12-03 00:09:00.087 187161 INFO nova.compute.manager [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Took 16.66 seconds to build instance.
Dec 03 00:09:00 compute-1 nova_compute[187157]: 2025-12-03 00:09:00.140 187161 DEBUG nova.compute.manager [req-3e7d68f4-a44a-4400-9a12-5a84e75308d0 req-84e37871-3278-496d-ac34-fafb4b5381db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Received event network-vif-plugged-8cea46ae-d25b-44ab-826a-ac08e1df95d3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:09:00 compute-1 nova_compute[187157]: 2025-12-03 00:09:00.140 187161 DEBUG oslo_concurrency.lockutils [req-3e7d68f4-a44a-4400-9a12-5a84e75308d0 req-84e37871-3278-496d-ac34-fafb4b5381db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:00 compute-1 nova_compute[187157]: 2025-12-03 00:09:00.140 187161 DEBUG oslo_concurrency.lockutils [req-3e7d68f4-a44a-4400-9a12-5a84e75308d0 req-84e37871-3278-496d-ac34-fafb4b5381db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:00 compute-1 nova_compute[187157]: 2025-12-03 00:09:00.141 187161 DEBUG oslo_concurrency.lockutils [req-3e7d68f4-a44a-4400-9a12-5a84e75308d0 req-84e37871-3278-496d-ac34-fafb4b5381db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:00 compute-1 nova_compute[187157]: 2025-12-03 00:09:00.141 187161 DEBUG nova.compute.manager [req-3e7d68f4-a44a-4400-9a12-5a84e75308d0 req-84e37871-3278-496d-ac34-fafb4b5381db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] No waiting events found dispatching network-vif-plugged-8cea46ae-d25b-44ab-826a-ac08e1df95d3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:09:00 compute-1 nova_compute[187157]: 2025-12-03 00:09:00.141 187161 WARNING nova.compute.manager [req-3e7d68f4-a44a-4400-9a12-5a84e75308d0 req-84e37871-3278-496d-ac34-fafb4b5381db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Received unexpected event network-vif-plugged-8cea46ae-d25b-44ab-826a-ac08e1df95d3 for instance with vm_state active and task_state None.
Dec 03 00:09:00 compute-1 nova_compute[187157]: 2025-12-03 00:09:00.594 187161 DEBUG oslo_concurrency.lockutils [None req-33e91b46-9a02-48d1-9b26-970803088343 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.177s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:01.728 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:01.729 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:01.729 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:02 compute-1 nova_compute[187157]: 2025-12-03 00:09:02.634 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:05 compute-1 nova_compute[187157]: 2025-12-03 00:09:05.023 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:05 compute-1 podman[215609]: 2025-12-03 00:09:05.224198767 +0000 UTC m=+0.059436836 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible)
Dec 03 00:09:05 compute-1 podman[197537]: time="2025-12-03T00:09:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:09:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:09:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:09:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:09:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3074 "" "Go-http-client/1.1"
Dec 03 00:09:07 compute-1 podman[215630]: 2025-12-03 00:09:07.205089507 +0000 UTC m=+0.049935867 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 03 00:09:07 compute-1 nova_compute[187157]: 2025-12-03 00:09:07.637 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:10 compute-1 nova_compute[187157]: 2025-12-03 00:09:10.028 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:11 compute-1 ovn_controller[95464]: 2025-12-03T00:09:11Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:32:d5:ce 10.100.0.9
Dec 03 00:09:11 compute-1 ovn_controller[95464]: 2025-12-03T00:09:11Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:32:d5:ce 10.100.0.9
Dec 03 00:09:12 compute-1 nova_compute[187157]: 2025-12-03 00:09:12.674 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:12 compute-1 nova_compute[187157]: 2025-12-03 00:09:12.997 187161 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Creating tmpfile /var/lib/nova/instances/tmp5i5180fs to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 03 00:09:12 compute-1 nova_compute[187157]: 2025-12-03 00:09:12.998 187161 WARNING neutronclient.v2_0.client [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:13 compute-1 nova_compute[187157]: 2025-12-03 00:09:13.065 187161 DEBUG nova.compute.manager [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5i5180fs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 03 00:09:15 compute-1 nova_compute[187157]: 2025-12-03 00:09:15.032 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:16 compute-1 podman[215668]: 2025-12-03 00:09:16.204413473 +0000 UTC m=+0.046619248 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:09:16 compute-1 nova_compute[187157]: 2025-12-03 00:09:16.295 187161 WARNING neutronclient.v2_0.client [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:17 compute-1 nova_compute[187157]: 2025-12-03 00:09:17.676 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:19 compute-1 openstack_network_exporter[199685]: ERROR   00:09:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:09:19 compute-1 openstack_network_exporter[199685]: ERROR   00:09:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:09:19 compute-1 openstack_network_exporter[199685]: ERROR   00:09:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:09:19 compute-1 openstack_network_exporter[199685]: ERROR   00:09:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:09:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:09:19 compute-1 openstack_network_exporter[199685]: ERROR   00:09:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:09:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:09:20 compute-1 nova_compute[187157]: 2025-12-03 00:09:20.034 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:21 compute-1 nova_compute[187157]: 2025-12-03 00:09:21.455 187161 DEBUG nova.compute.manager [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5i5180fs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='13917c6d-537d-4b86-a989-9ce2df414798',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 03 00:09:22 compute-1 podman[215692]: 2025-12-03 00:09:22.230527647 +0000 UTC m=+0.075097152 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:09:22 compute-1 nova_compute[187157]: 2025-12-03 00:09:22.678 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:23 compute-1 nova_compute[187157]: 2025-12-03 00:09:23.474 187161 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-13917c6d-537d-4b86-a989-9ce2df414798" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:09:23 compute-1 nova_compute[187157]: 2025-12-03 00:09:23.475 187161 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-13917c6d-537d-4b86-a989-9ce2df414798" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:09:23 compute-1 nova_compute[187157]: 2025-12-03 00:09:23.476 187161 DEBUG nova.network.neutron [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:09:24 compute-1 nova_compute[187157]: 2025-12-03 00:09:24.086 187161 WARNING neutronclient.v2_0.client [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:24 compute-1 podman[215716]: 2025-12-03 00:09:24.208096418 +0000 UTC m=+0.047445139 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:09:24 compute-1 nova_compute[187157]: 2025-12-03 00:09:24.882 187161 WARNING neutronclient.v2_0.client [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:25 compute-1 nova_compute[187157]: 2025-12-03 00:09:25.016 187161 DEBUG nova.network.neutron [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Updating instance_info_cache with network_info: [{"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:09:25 compute-1 nova_compute[187157]: 2025-12-03 00:09:25.038 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:26 compute-1 nova_compute[187157]: 2025-12-03 00:09:26.291 187161 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-13917c6d-537d-4b86-a989-9ce2df414798" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:09:26 compute-1 nova_compute[187157]: 2025-12-03 00:09:26.613 187161 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5i5180fs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='13917c6d-537d-4b86-a989-9ce2df414798',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 03 00:09:26 compute-1 nova_compute[187157]: 2025-12-03 00:09:26.613 187161 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Creating instance directory: /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 03 00:09:26 compute-1 nova_compute[187157]: 2025-12-03 00:09:26.614 187161 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Creating disk.info with the contents: {'/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk': 'qcow2', '/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 03 00:09:26 compute-1 nova_compute[187157]: 2025-12-03 00:09:26.614 187161 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 03 00:09:26 compute-1 nova_compute[187157]: 2025-12-03 00:09:26.614 187161 DEBUG nova.objects.instance [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 13917c6d-537d-4b86-a989-9ce2df414798 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.247 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.251 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.253 187161 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.328 187161 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.329 187161 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.329 187161 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.330 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.333 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.333 187161 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.394 187161 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.395 187161 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.452 187161 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.453 187161 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.453 187161 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.502 187161 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.502 187161 DEBUG nova.virt.disk.api [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.503 187161 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.552 187161 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.552 187161 DEBUG nova.virt.disk.api [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.553 187161 DEBUG nova.objects.instance [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 13917c6d-537d-4b86-a989-9ce2df414798 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:09:27 compute-1 nova_compute[187157]: 2025-12-03 00:09:27.681 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:27 compute-1 ovn_controller[95464]: 2025-12-03T00:09:27Z|00175|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.222 187161 DEBUG nova.objects.base [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<13917c6d-537d-4b86-a989-9ce2df414798> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.222 187161 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.244 187161 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk.config 497664" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.245 187161 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.246 187161 DEBUG nova.virt.libvirt.vif [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:08:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-476622927',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-476',id=18,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:08:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-r72zwvbi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',
image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:08:37Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=13917c6d-537d-4b86-a989-9ce2df414798,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.246 187161 DEBUG nova.network.os_vif_util [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.247 187161 DEBUG nova.network.os_vif_util [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:5d:ed,bridge_name='br-int',has_traffic_filtering=True,id=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08bf4d8e-df') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.248 187161 DEBUG os_vif [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:5d:ed,bridge_name='br-int',has_traffic_filtering=True,id=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08bf4d8e-df') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.248 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.249 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.249 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.250 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.250 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'e6dfa855-c9c5-587c-84a9-9e8d9bb572d4', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.251 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.252 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.254 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.255 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08bf4d8e-df, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.255 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap08bf4d8e-df, col_values=(('qos', UUID('2b5a09e6-87eb-450e-bfcb-db3e220cab28')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.255 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap08bf4d8e-df, col_values=(('external_ids', {'iface-id': '08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:5d:ed', 'vm-uuid': '13917c6d-537d-4b86-a989-9ce2df414798'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.257 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:29 compute-1 NetworkManager[55553]: <info>  [1764720569.2581] manager: (tap08bf4d8e-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.259 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.263 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.264 187161 INFO os_vif [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:5d:ed,bridge_name='br-int',has_traffic_filtering=True,id=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08bf4d8e-df')
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.265 187161 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.265 187161 DEBUG nova.compute.manager [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5i5180fs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='13917c6d-537d-4b86-a989-9ce2df414798',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.266 187161 WARNING neutronclient.v2_0.client [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:29 compute-1 nova_compute[187157]: 2025-12-03 00:09:29.774 187161 WARNING neutronclient.v2_0.client [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:31 compute-1 nova_compute[187157]: 2025-12-03 00:09:31.984 187161 DEBUG nova.network.neutron [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Port 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 03 00:09:31 compute-1 nova_compute[187157]: 2025-12-03 00:09:31.998 187161 DEBUG nova.compute.manager [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5i5180fs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='13917c6d-537d-4b86-a989-9ce2df414798',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 03 00:09:32 compute-1 nova_compute[187157]: 2025-12-03 00:09:32.723 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:34 compute-1 nova_compute[187157]: 2025-12-03 00:09:34.258 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:35 compute-1 podman[197537]: time="2025-12-03T00:09:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:09:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:09:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:09:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:09:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3081 "" "Go-http-client/1.1"
Dec 03 00:09:35 compute-1 nova_compute[187157]: 2025-12-03 00:09:35.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:09:35 compute-1 podman[215756]: 2025-12-03 00:09:35.722528672 +0000 UTC m=+0.055575883 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 03 00:09:35 compute-1 kernel: tap08bf4d8e-df: entered promiscuous mode
Dec 03 00:09:35 compute-1 NetworkManager[55553]: <info>  [1764720575.7329] manager: (tap08bf4d8e-df): new Tun device (/org/freedesktop/NetworkManager/Devices/66)
Dec 03 00:09:35 compute-1 ovn_controller[95464]: 2025-12-03T00:09:35Z|00176|binding|INFO|Claiming lport 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb for this additional chassis.
Dec 03 00:09:35 compute-1 ovn_controller[95464]: 2025-12-03T00:09:35Z|00177|binding|INFO|08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb: Claiming fa:16:3e:03:5d:ed 10.100.0.12
Dec 03 00:09:35 compute-1 nova_compute[187157]: 2025-12-03 00:09:35.734 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:35 compute-1 ovn_controller[95464]: 2025-12-03T00:09:35Z|00178|binding|INFO|Setting lport 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb ovn-installed in OVS
Dec 03 00:09:35 compute-1 nova_compute[187157]: 2025-12-03 00:09:35.746 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:35 compute-1 nova_compute[187157]: 2025-12-03 00:09:35.748 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:35 compute-1 systemd-udevd[215791]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:09:35 compute-1 systemd-machined[153454]: New machine qemu-16-instance-00000012.
Dec 03 00:09:35 compute-1 NetworkManager[55553]: <info>  [1764720575.7750] device (tap08bf4d8e-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:09:35 compute-1 systemd[1]: Started Virtual Machine qemu-16-instance-00000012.
Dec 03 00:09:35 compute-1 NetworkManager[55553]: <info>  [1764720575.7765] device (tap08bf4d8e-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:09:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:36.242 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:5d:ed 10.100.0.12'], port_security=['fa:16:3e:03:5d:ed 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '13917c6d-537d-4b86-a989-9ce2df414798', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '869170c9b0864bd8a0f2258e90e55a84', 'neutron:revision_number': '10', 'neutron:security_group_ids': '21025524-a834-4687-a5db-4097a3a2991d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fb2dc55-b9aa-4540-a79d-797e2b8e81ae, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:09:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:36.244 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb in datapath 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 unbound from our chassis
Dec 03 00:09:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:36.245 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68
Dec 03 00:09:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:36.266 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[3d35604b-2083-4e24-96d7-4d836f8cbd22]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:36.295 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[e451fd2b-4116-4823-989c-2a2e364fd7e8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:36.298 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[33cc73a1-875c-4557-bd12-9faf34b043be]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:36.328 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[644135df-2c06-4484-8312-cb986cede3fd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:36.350 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a9b9317e-49ba-4d46-bd2e-9aa36d9d94a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c6ad8f4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:f8:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455761, 'reachable_time': 38494, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215812, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:36.373 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[016669c1-c0d3-4ba4-b28c-613e57b0cdf1]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9c6ad8f4-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 455770, 'tstamp': 455770}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215813, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9c6ad8f4-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 455773, 'tstamp': 455773}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215813, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:36.374 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c6ad8f4-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:36 compute-1 nova_compute[187157]: 2025-12-03 00:09:36.376 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:36.378 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c6ad8f4-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:36.378 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:09:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:36.378 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9c6ad8f4-60, col_values=(('external_ids', {'iface-id': 'df9da247-f3c2-412c-95a4-9a2562c93dd4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:36.379 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:09:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:36.381 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[68e1dd88-719f-43b9-a2f8-2ab0680baf1a]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:37 compute-1 nova_compute[187157]: 2025-12-03 00:09:37.724 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:38 compute-1 podman[215822]: 2025-12-03 00:09:38.238048581 +0000 UTC m=+0.072776706 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2)
Dec 03 00:09:39 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:39.099 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:09:39 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:39.100 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:09:39 compute-1 nova_compute[187157]: 2025-12-03 00:09:39.101 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:39 compute-1 ovn_controller[95464]: 2025-12-03T00:09:39Z|00179|binding|INFO|Claiming lport 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb for this chassis.
Dec 03 00:09:39 compute-1 ovn_controller[95464]: 2025-12-03T00:09:39Z|00180|binding|INFO|08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb: Claiming fa:16:3e:03:5d:ed 10.100.0.12
Dec 03 00:09:39 compute-1 ovn_controller[95464]: 2025-12-03T00:09:39Z|00181|binding|INFO|Setting lport 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb up in Southbound
Dec 03 00:09:39 compute-1 nova_compute[187157]: 2025-12-03 00:09:39.260 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:40 compute-1 nova_compute[187157]: 2025-12-03 00:09:40.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:09:41 compute-1 nova_compute[187157]: 2025-12-03 00:09:41.277 187161 INFO nova.compute.manager [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Post operation of migration started
Dec 03 00:09:41 compute-1 nova_compute[187157]: 2025-12-03 00:09:41.278 187161 WARNING neutronclient.v2_0.client [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:41 compute-1 nova_compute[187157]: 2025-12-03 00:09:41.391 187161 WARNING neutronclient.v2_0.client [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:41 compute-1 nova_compute[187157]: 2025-12-03 00:09:41.391 187161 WARNING neutronclient.v2_0.client [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:41 compute-1 nova_compute[187157]: 2025-12-03 00:09:41.501 187161 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-13917c6d-537d-4b86-a989-9ce2df414798" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:09:41 compute-1 nova_compute[187157]: 2025-12-03 00:09:41.502 187161 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-13917c6d-537d-4b86-a989-9ce2df414798" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:09:41 compute-1 nova_compute[187157]: 2025-12-03 00:09:41.502 187161 DEBUG nova.network.neutron [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:09:42 compute-1 nova_compute[187157]: 2025-12-03 00:09:42.150 187161 WARNING neutronclient.v2_0.client [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:42 compute-1 nova_compute[187157]: 2025-12-03 00:09:42.543 187161 WARNING neutronclient.v2_0.client [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:42 compute-1 nova_compute[187157]: 2025-12-03 00:09:42.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:09:42 compute-1 nova_compute[187157]: 2025-12-03 00:09:42.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:09:42 compute-1 nova_compute[187157]: 2025-12-03 00:09:42.753 187161 DEBUG nova.network.neutron [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Updating instance_info_cache with network_info: [{"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:09:42 compute-1 nova_compute[187157]: 2025-12-03 00:09:42.772 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:43 compute-1 nova_compute[187157]: 2025-12-03 00:09:43.239 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:43 compute-1 nova_compute[187157]: 2025-12-03 00:09:43.240 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:43 compute-1 nova_compute[187157]: 2025-12-03 00:09:43.240 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:43 compute-1 nova_compute[187157]: 2025-12-03 00:09:43.240 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:09:43 compute-1 nova_compute[187157]: 2025-12-03 00:09:43.268 187161 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-13917c6d-537d-4b86-a989-9ce2df414798" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:09:43 compute-1 nova_compute[187157]: 2025-12-03 00:09:43.820 187161 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:43 compute-1 nova_compute[187157]: 2025-12-03 00:09:43.821 187161 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:43 compute-1 nova_compute[187157]: 2025-12-03 00:09:43.821 187161 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:43 compute-1 nova_compute[187157]: 2025-12-03 00:09:43.825 187161 INFO nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 03 00:09:43 compute-1 virtqemud[186882]: Domain id=16 name='instance-00000012' uuid=13917c6d-537d-4b86-a989-9ce2df414798 is tainted: custom-monitor
Dec 03 00:09:44 compute-1 nova_compute[187157]: 2025-12-03 00:09:44.263 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:44 compute-1 nova_compute[187157]: 2025-12-03 00:09:44.331 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:09:44 compute-1 nova_compute[187157]: 2025-12-03 00:09:44.383 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:09:44 compute-1 nova_compute[187157]: 2025-12-03 00:09:44.384 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:09:44 compute-1 nova_compute[187157]: 2025-12-03 00:09:44.432 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:09:44 compute-1 nova_compute[187157]: 2025-12-03 00:09:44.436 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:09:44 compute-1 nova_compute[187157]: 2025-12-03 00:09:44.486 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:09:44 compute-1 nova_compute[187157]: 2025-12-03 00:09:44.486 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:09:44 compute-1 nova_compute[187157]: 2025-12-03 00:09:44.547 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:09:44 compute-1 nova_compute[187157]: 2025-12-03 00:09:44.717 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:09:44 compute-1 nova_compute[187157]: 2025-12-03 00:09:44.718 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:09:44 compute-1 nova_compute[187157]: 2025-12-03 00:09:44.736 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:09:44 compute-1 nova_compute[187157]: 2025-12-03 00:09:44.737 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5512MB free_disk=73.1082534790039GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:09:44 compute-1 nova_compute[187157]: 2025-12-03 00:09:44.737 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:44 compute-1 nova_compute[187157]: 2025-12-03 00:09:44.737 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:44 compute-1 nova_compute[187157]: 2025-12-03 00:09:44.832 187161 INFO nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 03 00:09:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:45.102 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:45 compute-1 nova_compute[187157]: 2025-12-03 00:09:45.838 187161 INFO nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 03 00:09:45 compute-1 nova_compute[187157]: 2025-12-03 00:09:45.841 187161 DEBUG nova.compute.manager [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:09:47 compute-1 podman[215860]: 2025-12-03 00:09:47.204682412 +0000 UTC m=+0.046153587 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:09:47 compute-1 nova_compute[187157]: 2025-12-03 00:09:47.775 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:48 compute-1 nova_compute[187157]: 2025-12-03 00:09:48.859 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Migration for instance 13917c6d-537d-4b86-a989-9ce2df414798 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:09:49 compute-1 nova_compute[187157]: 2025-12-03 00:09:49.208 187161 DEBUG nova.objects.instance [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 03 00:09:49 compute-1 nova_compute[187157]: 2025-12-03 00:09:49.265 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:49 compute-1 nova_compute[187157]: 2025-12-03 00:09:49.381 187161 INFO nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Updating resource usage from migration 0843730c-83f4-417c-a99b-29298db49e9e
Dec 03 00:09:49 compute-1 nova_compute[187157]: 2025-12-03 00:09:49.382 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Starting to track incoming migration 0843730c-83f4-417c-a99b-29298db49e9e with flavor b2669e62-ef04-4b34-b3d6-69efcfbafbdc _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 03 00:09:49 compute-1 openstack_network_exporter[199685]: ERROR   00:09:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:09:49 compute-1 openstack_network_exporter[199685]: ERROR   00:09:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:09:49 compute-1 openstack_network_exporter[199685]: ERROR   00:09:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:09:49 compute-1 openstack_network_exporter[199685]: ERROR   00:09:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:09:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:09:49 compute-1 openstack_network_exporter[199685]: ERROR   00:09:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:09:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:09:49 compute-1 nova_compute[187157]: 2025-12-03 00:09:49.955 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:09:50 compute-1 nova_compute[187157]: 2025-12-03 00:09:50.245 187161 WARNING neutronclient.v2_0.client [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:50 compute-1 nova_compute[187157]: 2025-12-03 00:09:50.461 187161 WARNING nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 13917c6d-537d-4b86-a989-9ce2df414798 is not being actively managed by this compute host but has allocations referencing this compute host: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocation because we do not know what to do.
Dec 03 00:09:50 compute-1 nova_compute[187157]: 2025-12-03 00:09:50.461 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:09:50 compute-1 nova_compute[187157]: 2025-12-03 00:09:50.462 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:09:44 up  1:16,  0 user,  load average: 0.48, 0.46, 0.41\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_869170c9b0864bd8a0f2258e90e55a84': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:09:50 compute-1 nova_compute[187157]: 2025-12-03 00:09:50.549 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:09:50 compute-1 nova_compute[187157]: 2025-12-03 00:09:50.606 187161 WARNING neutronclient.v2_0.client [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:50 compute-1 nova_compute[187157]: 2025-12-03 00:09:50.607 187161 WARNING neutronclient.v2_0.client [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:51 compute-1 nova_compute[187157]: 2025-12-03 00:09:51.058 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:09:51 compute-1 nova_compute[187157]: 2025-12-03 00:09:51.594 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:09:51 compute-1 nova_compute[187157]: 2025-12-03 00:09:51.596 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 6.858s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:52 compute-1 nova_compute[187157]: 2025-12-03 00:09:52.776 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:53 compute-1 podman[215885]: 2025-12-03 00:09:53.27562403 +0000 UTC m=+0.113351838 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 03 00:09:53 compute-1 nova_compute[187157]: 2025-12-03 00:09:53.324 187161 DEBUG oslo_concurrency.lockutils [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:53 compute-1 nova_compute[187157]: 2025-12-03 00:09:53.325 187161 DEBUG oslo_concurrency.lockutils [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:53 compute-1 nova_compute[187157]: 2025-12-03 00:09:53.325 187161 DEBUG oslo_concurrency.lockutils [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:53 compute-1 nova_compute[187157]: 2025-12-03 00:09:53.325 187161 DEBUG oslo_concurrency.lockutils [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:53 compute-1 nova_compute[187157]: 2025-12-03 00:09:53.325 187161 DEBUG oslo_concurrency.lockutils [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:53 compute-1 nova_compute[187157]: 2025-12-03 00:09:53.336 187161 INFO nova.compute.manager [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Terminating instance
Dec 03 00:09:53 compute-1 nova_compute[187157]: 2025-12-03 00:09:53.854 187161 DEBUG nova.compute.manager [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:09:54 compute-1 nova_compute[187157]: 2025-12-03 00:09:54.266 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:54 compute-1 kernel: tap8cea46ae-d2 (unregistering): left promiscuous mode
Dec 03 00:09:54 compute-1 NetworkManager[55553]: <info>  [1764720594.7276] device (tap8cea46ae-d2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:09:54 compute-1 nova_compute[187157]: 2025-12-03 00:09:54.788 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:54 compute-1 ovn_controller[95464]: 2025-12-03T00:09:54Z|00182|binding|INFO|Releasing lport 8cea46ae-d25b-44ab-826a-ac08e1df95d3 from this chassis (sb_readonly=0)
Dec 03 00:09:54 compute-1 ovn_controller[95464]: 2025-12-03T00:09:54Z|00183|binding|INFO|Setting lport 8cea46ae-d25b-44ab-826a-ac08e1df95d3 down in Southbound
Dec 03 00:09:54 compute-1 ovn_controller[95464]: 2025-12-03T00:09:54Z|00184|binding|INFO|Removing iface tap8cea46ae-d2 ovn-installed in OVS
Dec 03 00:09:54 compute-1 nova_compute[187157]: 2025-12-03 00:09:54.790 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:54 compute-1 nova_compute[187157]: 2025-12-03 00:09:54.803 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:54 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000013.scope: Deactivated successfully.
Dec 03 00:09:54 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000013.scope: Consumed 15.342s CPU time.
Dec 03 00:09:54 compute-1 systemd-machined[153454]: Machine qemu-15-instance-00000013 terminated.
Dec 03 00:09:54 compute-1 nova_compute[187157]: 2025-12-03 00:09:54.874 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:54 compute-1 nova_compute[187157]: 2025-12-03 00:09:54.878 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:54 compute-1 podman[215913]: 2025-12-03 00:09:54.906267814 +0000 UTC m=+0.084827894 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 03 00:09:54 compute-1 nova_compute[187157]: 2025-12-03 00:09:54.916 187161 INFO nova.virt.libvirt.driver [-] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Instance destroyed successfully.
Dec 03 00:09:54 compute-1 nova_compute[187157]: 2025-12-03 00:09:54.916 187161 DEBUG nova.objects.instance [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lazy-loading 'resources' on Instance uuid 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:09:54 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:54.974 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:d5:ce 10.100.0.9'], port_security=['fa:16:3e:32:d5:ce 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1e3e5721-9cda-4368-b5c4-c6a8d4d8db95', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '869170c9b0864bd8a0f2258e90e55a84', 'neutron:revision_number': '5', 'neutron:security_group_ids': '21025524-a834-4687-a5db-4097a3a2991d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fb2dc55-b9aa-4540-a79d-797e2b8e81ae, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=8cea46ae-d25b-44ab-826a-ac08e1df95d3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:09:54 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:54.975 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 8cea46ae-d25b-44ab-826a-ac08e1df95d3 in datapath 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 unbound from our chassis
Dec 03 00:09:54 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:54.976 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68
Dec 03 00:09:54 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:54.988 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[deab4213-f34b-42c6-ae88-086601bef979]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:55.011 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[135b9e45-a8e5-42e1-83d2-4d366f818d6a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:55.012 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[9f5ff5c7-d033-46cd-b04d-66cfa6e90ea0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:55.035 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[2d7f551f-4ccc-45b6-90f3-46d9df3dfc53]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:55.049 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e800e71f-03ab-4b7d-b91c-27f485af05e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c6ad8f4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:f8:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455761, 'reachable_time': 38494, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215959, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:55.061 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[563ce86e-f293-4dea-887f-93909e24db2b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9c6ad8f4-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 455770, 'tstamp': 455770}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215960, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9c6ad8f4-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 455773, 'tstamp': 455773}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215960, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:55.062 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c6ad8f4-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.063 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.068 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:55.068 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c6ad8f4-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:55.068 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:09:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:55.068 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9c6ad8f4-60, col_values=(('external_ids', {'iface-id': 'df9da247-f3c2-412c-95a4-9a2562c93dd4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:55.068 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:09:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:09:55.069 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[2c87fee5-bf67-48c9-8193-b184dd2bd2d0]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.108 187161 DEBUG nova.compute.manager [req-2cf64584-3efa-457a-9900-cf6a2e8fc885 req-f1a3e260-d139-4750-9d4b-40903ff29513 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Received event network-vif-unplugged-8cea46ae-d25b-44ab-826a-ac08e1df95d3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.108 187161 DEBUG oslo_concurrency.lockutils [req-2cf64584-3efa-457a-9900-cf6a2e8fc885 req-f1a3e260-d139-4750-9d4b-40903ff29513 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.109 187161 DEBUG oslo_concurrency.lockutils [req-2cf64584-3efa-457a-9900-cf6a2e8fc885 req-f1a3e260-d139-4750-9d4b-40903ff29513 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.109 187161 DEBUG oslo_concurrency.lockutils [req-2cf64584-3efa-457a-9900-cf6a2e8fc885 req-f1a3e260-d139-4750-9d4b-40903ff29513 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.109 187161 DEBUG nova.compute.manager [req-2cf64584-3efa-457a-9900-cf6a2e8fc885 req-f1a3e260-d139-4750-9d4b-40903ff29513 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] No waiting events found dispatching network-vif-unplugged-8cea46ae-d25b-44ab-826a-ac08e1df95d3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.109 187161 DEBUG nova.compute.manager [req-2cf64584-3efa-457a-9900-cf6a2e8fc885 req-f1a3e260-d139-4750-9d4b-40903ff29513 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Received event network-vif-unplugged-8cea46ae-d25b-44ab-826a-ac08e1df95d3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.474 187161 DEBUG nova.virt.libvirt.vif [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-03T00:08:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-692236767',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-692',id=19,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:08:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-f791gp5t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:08:59Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=1e3e5721-9cda-4368-b5c4-c6a8d4d8db95,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "address": "fa:16:3e:32:d5:ce", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cea46ae-d2", "ovs_interfaceid": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.474 187161 DEBUG nova.network.os_vif_util [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converting VIF {"id": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "address": "fa:16:3e:32:d5:ce", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cea46ae-d2", "ovs_interfaceid": "8cea46ae-d25b-44ab-826a-ac08e1df95d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.475 187161 DEBUG nova.network.os_vif_util [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:d5:ce,bridge_name='br-int',has_traffic_filtering=True,id=8cea46ae-d25b-44ab-826a-ac08e1df95d3,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cea46ae-d2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.475 187161 DEBUG os_vif [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:d5:ce,bridge_name='br-int',has_traffic_filtering=True,id=8cea46ae-d25b-44ab-826a-ac08e1df95d3,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cea46ae-d2') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.477 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.477 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cea46ae-d2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.479 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.480 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.481 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.481 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=51ef5859-aea8-4949-8b89-7cb128f338cc) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.481 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.482 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.484 187161 INFO os_vif [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:d5:ce,bridge_name='br-int',has_traffic_filtering=True,id=8cea46ae-d25b-44ab-826a-ac08e1df95d3,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cea46ae-d2')
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.484 187161 INFO nova.virt.libvirt.driver [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Deleting instance files /var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95_del
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.485 187161 INFO nova.virt.libvirt.driver [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Deletion of /var/lib/nova/instances/1e3e5721-9cda-4368-b5c4-c6a8d4d8db95_del complete
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.592 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.592 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.592 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.593 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.593 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.996 187161 INFO nova.compute.manager [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Took 2.14 seconds to destroy the instance on the hypervisor.
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.997 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.997 187161 DEBUG nova.compute.manager [-] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.997 187161 DEBUG nova.network.neutron [-] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:09:55 compute-1 nova_compute[187157]: 2025-12-03 00:09:55.997 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:56 compute-1 nova_compute[187157]: 2025-12-03 00:09:56.629 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:57 compute-1 nova_compute[187157]: 2025-12-03 00:09:57.377 187161 DEBUG nova.compute.manager [req-33697023-97fe-4aee-8f1e-ceb1f366504c req-78665073-4a23-48a5-b3d1-2140c84839dd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Received event network-vif-unplugged-8cea46ae-d25b-44ab-826a-ac08e1df95d3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:09:57 compute-1 nova_compute[187157]: 2025-12-03 00:09:57.378 187161 DEBUG oslo_concurrency.lockutils [req-33697023-97fe-4aee-8f1e-ceb1f366504c req-78665073-4a23-48a5-b3d1-2140c84839dd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:57 compute-1 nova_compute[187157]: 2025-12-03 00:09:57.378 187161 DEBUG oslo_concurrency.lockutils [req-33697023-97fe-4aee-8f1e-ceb1f366504c req-78665073-4a23-48a5-b3d1-2140c84839dd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:57 compute-1 nova_compute[187157]: 2025-12-03 00:09:57.378 187161 DEBUG oslo_concurrency.lockutils [req-33697023-97fe-4aee-8f1e-ceb1f366504c req-78665073-4a23-48a5-b3d1-2140c84839dd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:57 compute-1 nova_compute[187157]: 2025-12-03 00:09:57.378 187161 DEBUG nova.compute.manager [req-33697023-97fe-4aee-8f1e-ceb1f366504c req-78665073-4a23-48a5-b3d1-2140c84839dd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] No waiting events found dispatching network-vif-unplugged-8cea46ae-d25b-44ab-826a-ac08e1df95d3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:09:57 compute-1 nova_compute[187157]: 2025-12-03 00:09:57.379 187161 DEBUG nova.compute.manager [req-33697023-97fe-4aee-8f1e-ceb1f366504c req-78665073-4a23-48a5-b3d1-2140c84839dd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Received event network-vif-unplugged-8cea46ae-d25b-44ab-826a-ac08e1df95d3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:09:57 compute-1 nova_compute[187157]: 2025-12-03 00:09:57.639 187161 DEBUG nova.compute.manager [req-087d910a-3cfb-4029-b1f3-62de08e32fdc req-20f537e6-b24e-415d-8111-fabb386f702d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Received event network-vif-deleted-8cea46ae-d25b-44ab-826a-ac08e1df95d3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:09:57 compute-1 nova_compute[187157]: 2025-12-03 00:09:57.640 187161 INFO nova.compute.manager [req-087d910a-3cfb-4029-b1f3-62de08e32fdc req-20f537e6-b24e-415d-8111-fabb386f702d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Neutron deleted interface 8cea46ae-d25b-44ab-826a-ac08e1df95d3; detaching it from the instance and deleting it from the info cache
Dec 03 00:09:57 compute-1 nova_compute[187157]: 2025-12-03 00:09:57.640 187161 DEBUG nova.network.neutron [req-087d910a-3cfb-4029-b1f3-62de08e32fdc req-20f537e6-b24e-415d-8111-fabb386f702d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:09:57 compute-1 nova_compute[187157]: 2025-12-03 00:09:57.778 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:58 compute-1 nova_compute[187157]: 2025-12-03 00:09:58.088 187161 DEBUG nova.network.neutron [-] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:09:58 compute-1 nova_compute[187157]: 2025-12-03 00:09:58.146 187161 DEBUG nova.compute.manager [req-087d910a-3cfb-4029-b1f3-62de08e32fdc req-20f537e6-b24e-415d-8111-fabb386f702d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Detach interface failed, port_id=8cea46ae-d25b-44ab-826a-ac08e1df95d3, reason: Instance 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:09:58 compute-1 nova_compute[187157]: 2025-12-03 00:09:58.695 187161 INFO nova.compute.manager [-] [instance: 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95] Took 2.70 seconds to deallocate network for instance.
Dec 03 00:09:59 compute-1 nova_compute[187157]: 2025-12-03 00:09:59.212 187161 DEBUG oslo_concurrency.lockutils [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:59 compute-1 nova_compute[187157]: 2025-12-03 00:09:59.213 187161 DEBUG oslo_concurrency.lockutils [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:59 compute-1 nova_compute[187157]: 2025-12-03 00:09:59.266 187161 DEBUG nova.compute.provider_tree [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:09:59 compute-1 nova_compute[187157]: 2025-12-03 00:09:59.778 187161 DEBUG nova.scheduler.client.report [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:10:00 compute-1 nova_compute[187157]: 2025-12-03 00:10:00.287 187161 DEBUG oslo_concurrency.lockutils [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.075s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:10:00 compute-1 nova_compute[187157]: 2025-12-03 00:10:00.309 187161 INFO nova.scheduler.client.report [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Deleted allocations for instance 1e3e5721-9cda-4368-b5c4-c6a8d4d8db95
Dec 03 00:10:00 compute-1 nova_compute[187157]: 2025-12-03 00:10:00.483 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:01 compute-1 nova_compute[187157]: 2025-12-03 00:10:01.339 187161 DEBUG oslo_concurrency.lockutils [None req-eb76834b-744d-4e95-9d14-3d52ee4052d5 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "1e3e5721-9cda-4368-b5c4-c6a8d4d8db95" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.014s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:10:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:01.730 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:10:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:01.730 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:10:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:01.730 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:10:02 compute-1 nova_compute[187157]: 2025-12-03 00:10:02.779 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:05 compute-1 nova_compute[187157]: 2025-12-03 00:10:05.484 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:05 compute-1 podman[197537]: time="2025-12-03T00:10:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:10:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:10:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:10:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:10:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3080 "" "Go-http-client/1.1"
Dec 03 00:10:05 compute-1 nova_compute[187157]: 2025-12-03 00:10:05.875 187161 DEBUG oslo_concurrency.lockutils [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "13917c6d-537d-4b86-a989-9ce2df414798" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:10:05 compute-1 nova_compute[187157]: 2025-12-03 00:10:05.876 187161 DEBUG oslo_concurrency.lockutils [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:10:05 compute-1 nova_compute[187157]: 2025-12-03 00:10:05.876 187161 DEBUG oslo_concurrency.lockutils [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "13917c6d-537d-4b86-a989-9ce2df414798-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:10:05 compute-1 nova_compute[187157]: 2025-12-03 00:10:05.876 187161 DEBUG oslo_concurrency.lockutils [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:10:05 compute-1 nova_compute[187157]: 2025-12-03 00:10:05.877 187161 DEBUG oslo_concurrency.lockutils [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:10:05 compute-1 nova_compute[187157]: 2025-12-03 00:10:05.951 187161 INFO nova.compute.manager [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Terminating instance
Dec 03 00:10:06 compute-1 podman[215963]: 2025-12-03 00:10:06.222409944 +0000 UTC m=+0.058595634 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vcs-type=git)
Dec 03 00:10:06 compute-1 nova_compute[187157]: 2025-12-03 00:10:06.500 187161 DEBUG nova.compute.manager [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:10:06 compute-1 kernel: tap08bf4d8e-df (unregistering): left promiscuous mode
Dec 03 00:10:06 compute-1 NetworkManager[55553]: <info>  [1764720606.5183] device (tap08bf4d8e-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:10:06 compute-1 ovn_controller[95464]: 2025-12-03T00:10:06Z|00185|binding|INFO|Releasing lport 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb from this chassis (sb_readonly=0)
Dec 03 00:10:06 compute-1 ovn_controller[95464]: 2025-12-03T00:10:06Z|00186|binding|INFO|Setting lport 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb down in Southbound
Dec 03 00:10:06 compute-1 ovn_controller[95464]: 2025-12-03T00:10:06Z|00187|binding|INFO|Removing iface tap08bf4d8e-df ovn-installed in OVS
Dec 03 00:10:06 compute-1 nova_compute[187157]: 2025-12-03 00:10:06.524 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:06 compute-1 nova_compute[187157]: 2025-12-03 00:10:06.526 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:06 compute-1 nova_compute[187157]: 2025-12-03 00:10:06.539 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:06 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000012.scope: Deactivated successfully.
Dec 03 00:10:06 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000012.scope: Consumed 2.388s CPU time.
Dec 03 00:10:06 compute-1 systemd-machined[153454]: Machine qemu-16-instance-00000012 terminated.
Dec 03 00:10:06 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:06.652 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:5d:ed 10.100.0.12'], port_security=['fa:16:3e:03:5d:ed 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '13917c6d-537d-4b86-a989-9ce2df414798', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '869170c9b0864bd8a0f2258e90e55a84', 'neutron:revision_number': '14', 'neutron:security_group_ids': '21025524-a834-4687-a5db-4097a3a2991d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fb2dc55-b9aa-4540-a79d-797e2b8e81ae, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:10:06 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:06.652 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb in datapath 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 unbound from our chassis
Dec 03 00:10:06 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:06.653 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:10:06 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:06.653 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[6dec67fe-b3ea-405c-881a-fc7554b67c75]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:10:06 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:06.654 104348 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 namespace which is not needed anymore
Dec 03 00:10:06 compute-1 nova_compute[187157]: 2025-12-03 00:10:06.718 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:06 compute-1 nova_compute[187157]: 2025-12-03 00:10:06.722 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:06 compute-1 nova_compute[187157]: 2025-12-03 00:10:06.751 187161 INFO nova.virt.libvirt.driver [-] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Instance destroyed successfully.
Dec 03 00:10:06 compute-1 nova_compute[187157]: 2025-12-03 00:10:06.752 187161 DEBUG nova.objects.instance [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lazy-loading 'resources' on Instance uuid 13917c6d-537d-4b86-a989-9ce2df414798 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:10:06 compute-1 podman[216013]: 2025-12-03 00:10:06.763667541 +0000 UTC m=+0.034081987 container kill 19bc0f10c37b0e410d44a560ca9de7ec537a2afef6c3465e3c9db7aa684f486b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 03 00:10:06 compute-1 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[215593]: [NOTICE]   (215597) : haproxy version is 3.0.5-8e879a5
Dec 03 00:10:06 compute-1 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[215593]: [NOTICE]   (215597) : path to executable is /usr/sbin/haproxy
Dec 03 00:10:06 compute-1 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[215593]: [WARNING]  (215597) : Exiting Master process...
Dec 03 00:10:06 compute-1 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[215593]: [ALERT]    (215597) : Current worker (215599) exited with code 143 (Terminated)
Dec 03 00:10:06 compute-1 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[215593]: [WARNING]  (215597) : All workers exited. Exiting... (0)
Dec 03 00:10:06 compute-1 systemd[1]: libpod-19bc0f10c37b0e410d44a560ca9de7ec537a2afef6c3465e3c9db7aa684f486b.scope: Deactivated successfully.
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.102 187161 DEBUG nova.compute.manager [req-62b21878-6856-451a-8d89-58dc73d6177c req-7bc9886b-016b-4de8-905d-d125578dda85 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-vif-unplugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.103 187161 DEBUG oslo_concurrency.lockutils [req-62b21878-6856-451a-8d89-58dc73d6177c req-7bc9886b-016b-4de8-905d-d125578dda85 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "13917c6d-537d-4b86-a989-9ce2df414798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.103 187161 DEBUG oslo_concurrency.lockutils [req-62b21878-6856-451a-8d89-58dc73d6177c req-7bc9886b-016b-4de8-905d-d125578dda85 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.103 187161 DEBUG oslo_concurrency.lockutils [req-62b21878-6856-451a-8d89-58dc73d6177c req-7bc9886b-016b-4de8-905d-d125578dda85 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.104 187161 DEBUG nova.compute.manager [req-62b21878-6856-451a-8d89-58dc73d6177c req-7bc9886b-016b-4de8-905d-d125578dda85 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] No waiting events found dispatching network-vif-unplugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.104 187161 DEBUG nova.compute.manager [req-62b21878-6856-451a-8d89-58dc73d6177c req-7bc9886b-016b-4de8-905d-d125578dda85 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-vif-unplugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:10:07 compute-1 podman[216043]: 2025-12-03 00:10:07.13443684 +0000 UTC m=+0.351527398 container died 19bc0f10c37b0e410d44a560ca9de7ec537a2afef6c3465e3c9db7aa684f486b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 03 00:10:07 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-19bc0f10c37b0e410d44a560ca9de7ec537a2afef6c3465e3c9db7aa684f486b-userdata-shm.mount: Deactivated successfully.
Dec 03 00:10:07 compute-1 systemd[1]: var-lib-containers-storage-overlay-c3c254271f49ce3558a181d32817e60b137cd450a9162d6cc9deff9c20c053ec-merged.mount: Deactivated successfully.
Dec 03 00:10:07 compute-1 podman[216043]: 2025-12-03 00:10:07.268973526 +0000 UTC m=+0.486064064 container cleanup 19bc0f10c37b0e410d44a560ca9de7ec537a2afef6c3465e3c9db7aa684f486b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:10:07 compute-1 systemd[1]: libpod-conmon-19bc0f10c37b0e410d44a560ca9de7ec537a2afef6c3465e3c9db7aa684f486b.scope: Deactivated successfully.
Dec 03 00:10:07 compute-1 podman[216056]: 2025-12-03 00:10:07.430898188 +0000 UTC m=+0.309939592 container remove 19bc0f10c37b0e410d44a560ca9de7ec537a2afef6c3465e3c9db7aa684f486b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:10:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:07.436 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[3513dcfa-ff83-4f86-bad8-e70d30335842]: (4, ("Wed Dec  3 12:10:06 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 (19bc0f10c37b0e410d44a560ca9de7ec537a2afef6c3465e3c9db7aa684f486b)\n19bc0f10c37b0e410d44a560ca9de7ec537a2afef6c3465e3c9db7aa684f486b\nWed Dec  3 12:10:07 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 (19bc0f10c37b0e410d44a560ca9de7ec537a2afef6c3465e3c9db7aa684f486b)\n19bc0f10c37b0e410d44a560ca9de7ec537a2afef6c3465e3c9db7aa684f486b\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:10:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:07.438 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[9906a81e-5677-45ea-af40-17951b68df38]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:10:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:07.438 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:10:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:07.439 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d0ecebc8-1ce8-43b1-831f-45bbf67ca75a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:10:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:07.440 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c6ad8f4-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.441 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:07 compute-1 kernel: tap9c6ad8f4-60: left promiscuous mode
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.454 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.457 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:07.459 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b126c7d1-d762-46ce-a748-18a2c7076bea]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:10:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:07.474 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[6b74948e-1b9b-4536-ad99-044bc5d7e66d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:10:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:07.475 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e92acd36-003e-4dbe-aaf9-ec0af6903709]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:10:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:07.489 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[379ab159-df6f-46eb-8aa0-277e0e4eff9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455754, 'reachable_time': 16072, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216079, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:10:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:07.490 104464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:10:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:07.491 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[eda5ca54-001b-4602-9f24-1b403d544b18]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:10:07 compute-1 systemd[1]: run-netns-ovnmeta\x2d9c6ad8f4\x2d62a9\x2d4a0d\x2dac57\x2de980ee855c68.mount: Deactivated successfully.
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.587 187161 DEBUG nova.virt.libvirt.vif [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-03T00:08:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-476622927',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-476',id=18,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:08:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-r72zwvbi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',clean_attempts='1',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio'
,image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:09:49Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=13917c6d-537d-4b86-a989-9ce2df414798,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.588 187161 DEBUG nova.network.os_vif_util [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converting VIF {"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.588 187161 DEBUG nova.network.os_vif_util [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:03:5d:ed,bridge_name='br-int',has_traffic_filtering=True,id=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08bf4d8e-df') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.589 187161 DEBUG os_vif [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:5d:ed,bridge_name='br-int',has_traffic_filtering=True,id=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08bf4d8e-df') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.590 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.590 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08bf4d8e-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.623 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.625 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.625 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.626 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=2b5a09e6-87eb-450e-bfcb-db3e220cab28) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.626 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.627 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.628 187161 INFO os_vif [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:5d:ed,bridge_name='br-int',has_traffic_filtering=True,id=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08bf4d8e-df')
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.629 187161 INFO nova.virt.libvirt.driver [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Deleting instance files /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798_del
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.630 187161 INFO nova.virt.libvirt.driver [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Deletion of /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798_del complete
Dec 03 00:10:07 compute-1 nova_compute[187157]: 2025-12-03 00:10:07.781 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:08 compute-1 nova_compute[187157]: 2025-12-03 00:10:08.295 187161 INFO nova.compute.manager [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Took 1.79 seconds to destroy the instance on the hypervisor.
Dec 03 00:10:08 compute-1 nova_compute[187157]: 2025-12-03 00:10:08.295 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:10:08 compute-1 nova_compute[187157]: 2025-12-03 00:10:08.296 187161 DEBUG nova.compute.manager [-] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:10:08 compute-1 nova_compute[187157]: 2025-12-03 00:10:08.296 187161 DEBUG nova.network.neutron [-] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:10:08 compute-1 nova_compute[187157]: 2025-12-03 00:10:08.296 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:10:08 compute-1 nova_compute[187157]: 2025-12-03 00:10:08.629 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:10:09 compute-1 nova_compute[187157]: 2025-12-03 00:10:09.159 187161 DEBUG nova.compute.manager [req-a0cee7e5-1929-4f3b-a37a-1489a11c0ed0 req-6900370c-9f2d-4e03-93fb-b1afc851b791 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-vif-unplugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:10:09 compute-1 nova_compute[187157]: 2025-12-03 00:10:09.160 187161 DEBUG oslo_concurrency.lockutils [req-a0cee7e5-1929-4f3b-a37a-1489a11c0ed0 req-6900370c-9f2d-4e03-93fb-b1afc851b791 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "13917c6d-537d-4b86-a989-9ce2df414798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:10:09 compute-1 nova_compute[187157]: 2025-12-03 00:10:09.160 187161 DEBUG oslo_concurrency.lockutils [req-a0cee7e5-1929-4f3b-a37a-1489a11c0ed0 req-6900370c-9f2d-4e03-93fb-b1afc851b791 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:10:09 compute-1 nova_compute[187157]: 2025-12-03 00:10:09.160 187161 DEBUG oslo_concurrency.lockutils [req-a0cee7e5-1929-4f3b-a37a-1489a11c0ed0 req-6900370c-9f2d-4e03-93fb-b1afc851b791 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:10:09 compute-1 nova_compute[187157]: 2025-12-03 00:10:09.160 187161 DEBUG nova.compute.manager [req-a0cee7e5-1929-4f3b-a37a-1489a11c0ed0 req-6900370c-9f2d-4e03-93fb-b1afc851b791 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] No waiting events found dispatching network-vif-unplugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:10:09 compute-1 nova_compute[187157]: 2025-12-03 00:10:09.161 187161 DEBUG nova.compute.manager [req-a0cee7e5-1929-4f3b-a37a-1489a11c0ed0 req-6900370c-9f2d-4e03-93fb-b1afc851b791 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-vif-unplugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:10:09 compute-1 nova_compute[187157]: 2025-12-03 00:10:09.161 187161 DEBUG nova.compute.manager [req-a0cee7e5-1929-4f3b-a37a-1489a11c0ed0 req-6900370c-9f2d-4e03-93fb-b1afc851b791 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-vif-deleted-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:10:09 compute-1 nova_compute[187157]: 2025-12-03 00:10:09.161 187161 INFO nova.compute.manager [req-a0cee7e5-1929-4f3b-a37a-1489a11c0ed0 req-6900370c-9f2d-4e03-93fb-b1afc851b791 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Neutron deleted interface 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb; detaching it from the instance and deleting it from the info cache
Dec 03 00:10:09 compute-1 nova_compute[187157]: 2025-12-03 00:10:09.161 187161 DEBUG nova.network.neutron [req-a0cee7e5-1929-4f3b-a37a-1489a11c0ed0 req-6900370c-9f2d-4e03-93fb-b1afc851b791 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:10:09 compute-1 podman[216081]: 2025-12-03 00:10:09.237505261 +0000 UTC m=+0.083788240 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 03 00:10:09 compute-1 nova_compute[187157]: 2025-12-03 00:10:09.479 187161 DEBUG nova.network.neutron [-] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:10:09 compute-1 nova_compute[187157]: 2025-12-03 00:10:09.698 187161 DEBUG nova.compute.manager [req-a0cee7e5-1929-4f3b-a37a-1489a11c0ed0 req-6900370c-9f2d-4e03-93fb-b1afc851b791 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Detach interface failed, port_id=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb, reason: Instance 13917c6d-537d-4b86-a989-9ce2df414798 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:10:10 compute-1 nova_compute[187157]: 2025-12-03 00:10:10.006 187161 INFO nova.compute.manager [-] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Took 1.71 seconds to deallocate network for instance.
Dec 03 00:10:10 compute-1 nova_compute[187157]: 2025-12-03 00:10:10.585 187161 DEBUG oslo_concurrency.lockutils [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:10:10 compute-1 nova_compute[187157]: 2025-12-03 00:10:10.586 187161 DEBUG oslo_concurrency.lockutils [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:10:10 compute-1 nova_compute[187157]: 2025-12-03 00:10:10.592 187161 DEBUG oslo_concurrency.lockutils [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:10:10 compute-1 nova_compute[187157]: 2025-12-03 00:10:10.646 187161 INFO nova.scheduler.client.report [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Deleted allocations for instance 13917c6d-537d-4b86-a989-9ce2df414798
Dec 03 00:10:12 compute-1 nova_compute[187157]: 2025-12-03 00:10:12.305 187161 DEBUG oslo_concurrency.lockutils [None req-0b8c6ed6-d275-4d22-afa9-4f50629898e3 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.429s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:10:12 compute-1 nova_compute[187157]: 2025-12-03 00:10:12.626 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:12 compute-1 nova_compute[187157]: 2025-12-03 00:10:12.783 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:17 compute-1 nova_compute[187157]: 2025-12-03 00:10:17.627 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:17 compute-1 nova_compute[187157]: 2025-12-03 00:10:17.782 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:17 compute-1 nova_compute[187157]: 2025-12-03 00:10:17.784 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:18 compute-1 podman[216101]: 2025-12-03 00:10:18.199123682 +0000 UTC m=+0.044811225 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:10:19 compute-1 openstack_network_exporter[199685]: ERROR   00:10:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:10:19 compute-1 openstack_network_exporter[199685]: ERROR   00:10:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:10:19 compute-1 openstack_network_exporter[199685]: ERROR   00:10:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:10:19 compute-1 openstack_network_exporter[199685]: ERROR   00:10:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:10:19 compute-1 openstack_network_exporter[199685]: ERROR   00:10:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:10:22 compute-1 nova_compute[187157]: 2025-12-03 00:10:22.628 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:22 compute-1 nova_compute[187157]: 2025-12-03 00:10:22.786 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:24 compute-1 podman[216126]: 2025-12-03 00:10:24.228328059 +0000 UTC m=+0.074329222 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:10:25 compute-1 podman[216153]: 2025-12-03 00:10:25.20224219 +0000 UTC m=+0.048017021 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 03 00:10:27 compute-1 nova_compute[187157]: 2025-12-03 00:10:27.629 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:27 compute-1 nova_compute[187157]: 2025-12-03 00:10:27.788 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:32 compute-1 nova_compute[187157]: 2025-12-03 00:10:32.630 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:32 compute-1 nova_compute[187157]: 2025-12-03 00:10:32.791 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:35 compute-1 podman[197537]: time="2025-12-03T00:10:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:10:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:10:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:10:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:10:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2614 "" "Go-http-client/1.1"
Dec 03 00:10:36 compute-1 nova_compute[187157]: 2025-12-03 00:10:36.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:37 compute-1 podman[216174]: 2025-12-03 00:10:37.208697429 +0000 UTC m=+0.054357055 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:10:37 compute-1 nova_compute[187157]: 2025-12-03 00:10:37.631 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:37 compute-1 nova_compute[187157]: 2025-12-03 00:10:37.792 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:40 compute-1 podman[216196]: 2025-12-03 00:10:40.249580912 +0000 UTC m=+0.080021969 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2)
Dec 03 00:10:42 compute-1 nova_compute[187157]: 2025-12-03 00:10:42.634 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:42 compute-1 nova_compute[187157]: 2025-12-03 00:10:42.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:42 compute-1 nova_compute[187157]: 2025-12-03 00:10:42.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:42 compute-1 nova_compute[187157]: 2025-12-03 00:10:42.834 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:43 compute-1 nova_compute[187157]: 2025-12-03 00:10:43.149 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:43 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:43.150 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:10:43 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:43.151 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:10:43 compute-1 nova_compute[187157]: 2025-12-03 00:10:43.558 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:10:43 compute-1 nova_compute[187157]: 2025-12-03 00:10:43.559 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:10:43 compute-1 nova_compute[187157]: 2025-12-03 00:10:43.559 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:10:43 compute-1 nova_compute[187157]: 2025-12-03 00:10:43.559 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:10:43 compute-1 nova_compute[187157]: 2025-12-03 00:10:43.731 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:10:43 compute-1 nova_compute[187157]: 2025-12-03 00:10:43.732 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:10:43 compute-1 nova_compute[187157]: 2025-12-03 00:10:43.752 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:10:43 compute-1 nova_compute[187157]: 2025-12-03 00:10:43.753 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5813MB free_disk=73.16611099243164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:10:43 compute-1 nova_compute[187157]: 2025-12-03 00:10:43.753 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:10:43 compute-1 nova_compute[187157]: 2025-12-03 00:10:43.754 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:10:44 compute-1 nova_compute[187157]: 2025-12-03 00:10:44.971 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:10:44 compute-1 nova_compute[187157]: 2025-12-03 00:10:44.972 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:10:43 up  1:17,  0 user,  load average: 0.18, 0.38, 0.38\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:10:45 compute-1 nova_compute[187157]: 2025-12-03 00:10:45.095 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:10:45 compute-1 nova_compute[187157]: 2025-12-03 00:10:45.808 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:10:46 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:46.769 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:a0:ba 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44651134-dca8-45c2-963a-1f17aac67593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05a149bd8b504e438531bb5b9409e4db', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8f4f3fe-a4b0-48ff-9b01-d63a8cee7576, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c9c5ad1f-d82d-4b26-aa35-4c2bd8e4a10c) old=Port_Binding(mac=['fa:16:3e:63:a0:ba'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44651134-dca8-45c2-963a-1f17aac67593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05a149bd8b504e438531bb5b9409e4db', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:10:46 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:46.770 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c9c5ad1f-d82d-4b26-aa35-4c2bd8e4a10c in datapath 44651134-dca8-45c2-963a-1f17aac67593 updated
Dec 03 00:10:46 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:46.771 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44651134-dca8-45c2-963a-1f17aac67593, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:10:46 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:46.772 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[cbbcd47f-812d-4d9a-8e36-23a501be7087]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:10:47 compute-1 nova_compute[187157]: 2025-12-03 00:10:47.035 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:10:47 compute-1 nova_compute[187157]: 2025-12-03 00:10:47.035 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.281s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:10:47 compute-1 nova_compute[187157]: 2025-12-03 00:10:47.669 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:47 compute-1 nova_compute[187157]: 2025-12-03 00:10:47.834 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:10:48.153 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:10:49 compute-1 nova_compute[187157]: 2025-12-03 00:10:49.035 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:49 compute-1 nova_compute[187157]: 2025-12-03 00:10:49.036 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:49 compute-1 podman[216219]: 2025-12-03 00:10:49.209402029 +0000 UTC m=+0.055538592 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:10:49 compute-1 openstack_network_exporter[199685]: ERROR   00:10:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:10:49 compute-1 openstack_network_exporter[199685]: ERROR   00:10:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:10:49 compute-1 openstack_network_exporter[199685]: ERROR   00:10:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:10:49 compute-1 openstack_network_exporter[199685]: ERROR   00:10:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:10:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:10:49 compute-1 openstack_network_exporter[199685]: ERROR   00:10:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:10:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:10:52 compute-1 nova_compute[187157]: 2025-12-03 00:10:52.671 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:52 compute-1 nova_compute[187157]: 2025-12-03 00:10:52.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:52 compute-1 nova_compute[187157]: 2025-12-03 00:10:52.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:10:52 compute-1 nova_compute[187157]: 2025-12-03 00:10:52.837 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:54 compute-1 nova_compute[187157]: 2025-12-03 00:10:54.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:55 compute-1 podman[216243]: 2025-12-03 00:10:55.235999965 +0000 UTC m=+0.078839942 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:10:55 compute-1 podman[216269]: 2025-12-03 00:10:55.310275066 +0000 UTC m=+0.047357027 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Dec 03 00:10:55 compute-1 nova_compute[187157]: 2025-12-03 00:10:55.695 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:56 compute-1 nova_compute[187157]: 2025-12-03 00:10:56.278 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:57 compute-1 ovn_controller[95464]: 2025-12-03T00:10:57Z|00188|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Dec 03 00:10:57 compute-1 nova_compute[187157]: 2025-12-03 00:10:57.674 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:57 compute-1 nova_compute[187157]: 2025-12-03 00:10:57.839 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:01.731 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:01.731 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:01.731 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:01.955 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:3f:ab 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-a4a14d90-e145-46fe-ae48-a3de49800b87', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a4a14d90-e145-46fe-ae48-a3de49800b87', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8618acb8fd774a27ac00f4e0f10b934c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=06e69b18-9531-430d-9cad-90848fbfd86e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e5672078-29f7-493b-a2d0-b68ca62fdf76) old=Port_Binding(mac=['fa:16:3e:7c:3f:ab'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-a4a14d90-e145-46fe-ae48-a3de49800b87', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a4a14d90-e145-46fe-ae48-a3de49800b87', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8618acb8fd774a27ac00f4e0f10b934c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:11:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:01.956 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e5672078-29f7-493b-a2d0-b68ca62fdf76 in datapath a4a14d90-e145-46fe-ae48-a3de49800b87 updated
Dec 03 00:11:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:01.957 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a4a14d90-e145-46fe-ae48-a3de49800b87, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:11:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:01.958 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[29ba9390-c77c-4e07-93b1-5846a05ac2b8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:02 compute-1 nova_compute[187157]: 2025-12-03 00:11:02.675 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:02 compute-1 nova_compute[187157]: 2025-12-03 00:11:02.840 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:05 compute-1 podman[197537]: time="2025-12-03T00:11:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:11:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:11:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:11:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:11:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2604 "" "Go-http-client/1.1"
Dec 03 00:11:07 compute-1 nova_compute[187157]: 2025-12-03 00:11:07.678 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:07 compute-1 nova_compute[187157]: 2025-12-03 00:11:07.842 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:08 compute-1 podman[216290]: 2025-12-03 00:11:08.227547323 +0000 UTC m=+0.066406833 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is 
a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Dec 03 00:11:11 compute-1 podman[216311]: 2025-12-03 00:11:11.217321052 +0000 UTC m=+0.054736093 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4)
Dec 03 00:11:12 compute-1 nova_compute[187157]: 2025-12-03 00:11:12.679 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:12 compute-1 nova_compute[187157]: 2025-12-03 00:11:12.843 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:17 compute-1 nova_compute[187157]: 2025-12-03 00:11:17.710 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:17 compute-1 nova_compute[187157]: 2025-12-03 00:11:17.845 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:19 compute-1 openstack_network_exporter[199685]: ERROR   00:11:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:11:19 compute-1 openstack_network_exporter[199685]: ERROR   00:11:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:11:19 compute-1 openstack_network_exporter[199685]: ERROR   00:11:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:11:19 compute-1 openstack_network_exporter[199685]: ERROR   00:11:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:11:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:11:19 compute-1 openstack_network_exporter[199685]: ERROR   00:11:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:11:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:11:20 compute-1 podman[216331]: 2025-12-03 00:11:20.208738627 +0000 UTC m=+0.052346786 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:11:22 compute-1 nova_compute[187157]: 2025-12-03 00:11:22.711 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:22 compute-1 nova_compute[187157]: 2025-12-03 00:11:22.846 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:26 compute-1 podman[216356]: 2025-12-03 00:11:26.239507835 +0000 UTC m=+0.075618593 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent)
Dec 03 00:11:26 compute-1 podman[216357]: 2025-12-03 00:11:26.262566037 +0000 UTC m=+0.097024965 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Dec 03 00:11:27 compute-1 nova_compute[187157]: 2025-12-03 00:11:27.712 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:27 compute-1 nova_compute[187157]: 2025-12-03 00:11:27.908 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:32 compute-1 nova_compute[187157]: 2025-12-03 00:11:32.714 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:32 compute-1 nova_compute[187157]: 2025-12-03 00:11:32.910 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:35 compute-1 sshd-session[216397]: Invalid user solana from 193.32.162.146 port 41544
Dec 03 00:11:35 compute-1 sshd-session[216397]: Connection closed by invalid user solana 193.32.162.146 port 41544 [preauth]
Dec 03 00:11:35 compute-1 podman[197537]: time="2025-12-03T00:11:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:11:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:11:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:11:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:11:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2611 "" "Go-http-client/1.1"
Dec 03 00:11:37 compute-1 nova_compute[187157]: 2025-12-03 00:11:37.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:11:37 compute-1 nova_compute[187157]: 2025-12-03 00:11:37.716 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:37 compute-1 nova_compute[187157]: 2025-12-03 00:11:37.956 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:38 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 03 00:11:38 compute-1 podman[216400]: 2025-12-03 00:11:38.70723486 +0000 UTC m=+0.074365333 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base 
Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, io.openshift.expose-services=)
Dec 03 00:11:41 compute-1 nova_compute[187157]: 2025-12-03 00:11:41.258 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "c3f975c5-7a53-467c-b760-375c84eb4469" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:41 compute-1 nova_compute[187157]: 2025-12-03 00:11:41.258 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "c3f975c5-7a53-467c-b760-375c84eb4469" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:41 compute-1 nova_compute[187157]: 2025-12-03 00:11:41.763 187161 DEBUG nova.compute.manager [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:11:42 compute-1 podman[216421]: 2025-12-03 00:11:42.215396988 +0000 UTC m=+0.056407053 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 03 00:11:42 compute-1 nova_compute[187157]: 2025-12-03 00:11:42.319 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:42 compute-1 nova_compute[187157]: 2025-12-03 00:11:42.319 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:42 compute-1 nova_compute[187157]: 2025-12-03 00:11:42.327 187161 DEBUG nova.virt.hardware [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:11:42 compute-1 nova_compute[187157]: 2025-12-03 00:11:42.328 187161 INFO nova.compute.claims [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Claim successful on node compute-1.ctlplane.example.com
Dec 03 00:11:42 compute-1 nova_compute[187157]: 2025-12-03 00:11:42.717 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:43 compute-1 nova_compute[187157]: 2025-12-03 00:11:43.004 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:43 compute-1 nova_compute[187157]: 2025-12-03 00:11:43.404 187161 DEBUG nova.compute.provider_tree [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:11:43 compute-1 nova_compute[187157]: 2025-12-03 00:11:43.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:11:43 compute-1 nova_compute[187157]: 2025-12-03 00:11:43.910 187161 DEBUG nova.scheduler.client.report [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:11:44 compute-1 nova_compute[187157]: 2025-12-03 00:11:44.435 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.116s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:44 compute-1 nova_compute[187157]: 2025-12-03 00:11:44.436 187161 DEBUG nova.compute.manager [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:11:44 compute-1 nova_compute[187157]: 2025-12-03 00:11:44.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:11:44 compute-1 nova_compute[187157]: 2025-12-03 00:11:44.946 187161 DEBUG nova.compute.manager [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:11:44 compute-1 nova_compute[187157]: 2025-12-03 00:11:44.947 187161 DEBUG nova.network.neutron [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:11:44 compute-1 nova_compute[187157]: 2025-12-03 00:11:44.947 187161 WARNING neutronclient.v2_0.client [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:11:44 compute-1 nova_compute[187157]: 2025-12-03 00:11:44.947 187161 WARNING neutronclient.v2_0.client [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:11:45 compute-1 nova_compute[187157]: 2025-12-03 00:11:45.211 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:45 compute-1 nova_compute[187157]: 2025-12-03 00:11:45.212 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:45 compute-1 nova_compute[187157]: 2025-12-03 00:11:45.212 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:45 compute-1 nova_compute[187157]: 2025-12-03 00:11:45.212 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:11:45 compute-1 nova_compute[187157]: 2025-12-03 00:11:45.326 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:11:45 compute-1 nova_compute[187157]: 2025-12-03 00:11:45.327 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:11:45 compute-1 nova_compute[187157]: 2025-12-03 00:11:45.344 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:11:45 compute-1 nova_compute[187157]: 2025-12-03 00:11:45.345 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5819MB free_disk=73.16615295410156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:11:45 compute-1 nova_compute[187157]: 2025-12-03 00:11:45.345 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:45 compute-1 nova_compute[187157]: 2025-12-03 00:11:45.346 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:45 compute-1 nova_compute[187157]: 2025-12-03 00:11:45.454 187161 INFO nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:11:45 compute-1 nova_compute[187157]: 2025-12-03 00:11:45.963 187161 DEBUG nova.compute.manager [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:11:46 compute-1 nova_compute[187157]: 2025-12-03 00:11:46.390 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance c3f975c5-7a53-467c-b760-375c84eb4469 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:11:46 compute-1 nova_compute[187157]: 2025-12-03 00:11:46.391 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:11:46 compute-1 nova_compute[187157]: 2025-12-03 00:11:46.391 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:11:45 up  1:18,  0 user,  load average: 0.16, 0.34, 0.36\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_networking': '1', 'num_os_type_None': '1', 'num_proj_8618acb8fd774a27ac00f4e0f10b934c': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:11:46 compute-1 nova_compute[187157]: 2025-12-03 00:11:46.443 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:11:47 compute-1 nova_compute[187157]: 2025-12-03 00:11:47.718 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.006 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.170 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.255 187161 DEBUG nova.network.neutron [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Successfully created port: 4b0a44ca-2766-4f93-94c8-d849e3b3a6ed _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.636 187161 DEBUG nova.compute.manager [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.637 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.637 187161 INFO nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Creating image(s)
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.638 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "/var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.638 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "/var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.639 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "/var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.640 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.644 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.645 187161 DEBUG oslo_concurrency.processutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.680 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.681 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.335s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.694 187161 DEBUG oslo_concurrency.processutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.695 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.696 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.696 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.699 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.699 187161 DEBUG oslo_concurrency.processutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.747 187161 DEBUG oslo_concurrency.processutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.748 187161 DEBUG oslo_concurrency.processutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.781 187161 DEBUG oslo_concurrency.processutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.782 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.086s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.782 187161 DEBUG oslo_concurrency.processutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.830 187161 DEBUG oslo_concurrency.processutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.831 187161 DEBUG nova.virt.disk.api [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Checking if we can resize image /var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.832 187161 DEBUG oslo_concurrency.processutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.897 187161 DEBUG oslo_concurrency.processutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.898 187161 DEBUG nova.virt.disk.api [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Cannot resize image /var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.899 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.899 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Ensure instance console log exists: /var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.900 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.900 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:48 compute-1 nova_compute[187157]: 2025-12-03 00:11:48.900 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:49 compute-1 nova_compute[187157]: 2025-12-03 00:11:49.092 187161 DEBUG nova.network.neutron [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Successfully updated port: 4b0a44ca-2766-4f93-94c8-d849e3b3a6ed _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:11:49 compute-1 openstack_network_exporter[199685]: ERROR   00:11:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:11:49 compute-1 openstack_network_exporter[199685]: ERROR   00:11:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:11:49 compute-1 openstack_network_exporter[199685]: ERROR   00:11:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:11:49 compute-1 openstack_network_exporter[199685]: ERROR   00:11:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:11:49 compute-1 openstack_network_exporter[199685]: ERROR   00:11:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:11:49 compute-1 nova_compute[187157]: 2025-12-03 00:11:49.681 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:11:50 compute-1 nova_compute[187157]: 2025-12-03 00:11:50.141 187161 DEBUG nova.compute.manager [req-a2f52ee1-1e65-4d41-ac0c-9787ccd4a19b req-cdeb6f82-d683-4d07-ab30-025247880811 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Received event network-changed-4b0a44ca-2766-4f93-94c8-d849e3b3a6ed external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:11:50 compute-1 nova_compute[187157]: 2025-12-03 00:11:50.141 187161 DEBUG nova.compute.manager [req-a2f52ee1-1e65-4d41-ac0c-9787ccd4a19b req-cdeb6f82-d683-4d07-ab30-025247880811 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Refreshing instance network info cache due to event network-changed-4b0a44ca-2766-4f93-94c8-d849e3b3a6ed. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:11:50 compute-1 nova_compute[187157]: 2025-12-03 00:11:50.142 187161 DEBUG oslo_concurrency.lockutils [req-a2f52ee1-1e65-4d41-ac0c-9787ccd4a19b req-cdeb6f82-d683-4d07-ab30-025247880811 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-c3f975c5-7a53-467c-b760-375c84eb4469" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:11:50 compute-1 nova_compute[187157]: 2025-12-03 00:11:50.142 187161 DEBUG oslo_concurrency.lockutils [req-a2f52ee1-1e65-4d41-ac0c-9787ccd4a19b req-cdeb6f82-d683-4d07-ab30-025247880811 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-c3f975c5-7a53-467c-b760-375c84eb4469" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:11:50 compute-1 nova_compute[187157]: 2025-12-03 00:11:50.142 187161 DEBUG nova.network.neutron [req-a2f52ee1-1e65-4d41-ac0c-9787ccd4a19b req-cdeb6f82-d683-4d07-ab30-025247880811 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Refreshing network info cache for port 4b0a44ca-2766-4f93-94c8-d849e3b3a6ed _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:11:50 compute-1 nova_compute[187157]: 2025-12-03 00:11:50.638 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "refresh_cache-c3f975c5-7a53-467c-b760-375c84eb4469" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:11:50 compute-1 nova_compute[187157]: 2025-12-03 00:11:50.663 187161 WARNING neutronclient.v2_0.client [req-a2f52ee1-1e65-4d41-ac0c-9787ccd4a19b req-cdeb6f82-d683-4d07-ab30-025247880811 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:11:50 compute-1 nova_compute[187157]: 2025-12-03 00:11:50.695 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:11:51 compute-1 podman[216457]: 2025-12-03 00:11:51.228244068 +0000 UTC m=+0.073957324 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:11:51 compute-1 nova_compute[187157]: 2025-12-03 00:11:51.645 187161 DEBUG nova.network.neutron [req-a2f52ee1-1e65-4d41-ac0c-9787ccd4a19b req-cdeb6f82-d683-4d07-ab30-025247880811 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:11:52 compute-1 nova_compute[187157]: 2025-12-03 00:11:52.057 187161 DEBUG nova.network.neutron [req-a2f52ee1-1e65-4d41-ac0c-9787ccd4a19b req-cdeb6f82-d683-4d07-ab30-025247880811 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:11:52 compute-1 nova_compute[187157]: 2025-12-03 00:11:52.561 187161 DEBUG oslo_concurrency.lockutils [req-a2f52ee1-1e65-4d41-ac0c-9787ccd4a19b req-cdeb6f82-d683-4d07-ab30-025247880811 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-c3f975c5-7a53-467c-b760-375c84eb4469" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:11:52 compute-1 nova_compute[187157]: 2025-12-03 00:11:52.562 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquired lock "refresh_cache-c3f975c5-7a53-467c-b760-375c84eb4469" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:11:52 compute-1 nova_compute[187157]: 2025-12-03 00:11:52.562 187161 DEBUG nova.network.neutron [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:11:52 compute-1 nova_compute[187157]: 2025-12-03 00:11:52.720 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:53 compute-1 nova_compute[187157]: 2025-12-03 00:11:53.007 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:53 compute-1 nova_compute[187157]: 2025-12-03 00:11:53.641 187161 DEBUG nova.network.neutron [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:11:53 compute-1 nova_compute[187157]: 2025-12-03 00:11:53.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:11:53 compute-1 nova_compute[187157]: 2025-12-03 00:11:53.701 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:11:53 compute-1 nova_compute[187157]: 2025-12-03 00:11:53.824 187161 WARNING neutronclient.v2_0.client [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.386 187161 DEBUG nova.network.neutron [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Updating instance_info_cache with network_info: [{"id": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "address": "fa:16:3e:af:45:d1", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0a44ca-27", "ovs_interfaceid": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.892 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Releasing lock "refresh_cache-c3f975c5-7a53-467c-b760-375c84eb4469" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.892 187161 DEBUG nova.compute.manager [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Instance network_info: |[{"id": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "address": "fa:16:3e:af:45:d1", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0a44ca-27", "ovs_interfaceid": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.895 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Start _get_guest_xml network_info=[{"id": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "address": "fa:16:3e:af:45:d1", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0a44ca-27", "ovs_interfaceid": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.898 187161 WARNING nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.899 187161 DEBUG nova.virt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-945677260', uuid='c3f975c5-7a53-467c-b760-375c84eb4469'), owner=OwnerMeta(userid='6048ff4ab0aa45689a23ca16a6558b9d', username='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013-project-admin', projectid='8618acb8fd774a27ac00f4e0f10b934c', projectname='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "address": "fa:16:3e:af:45:d1", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0a44ca-27", "ovs_interfaceid": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764720714.8997838) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.902 187161 DEBUG nova.virt.libvirt.host [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.903 187161 DEBUG nova.virt.libvirt.host [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.905 187161 DEBUG nova.virt.libvirt.host [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.906 187161 DEBUG nova.virt.libvirt.host [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.906 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.907 187161 DEBUG nova.virt.hardware [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.907 187161 DEBUG nova.virt.hardware [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.907 187161 DEBUG nova.virt.hardware [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.908 187161 DEBUG nova.virt.hardware [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.908 187161 DEBUG nova.virt.hardware [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.908 187161 DEBUG nova.virt.hardware [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.908 187161 DEBUG nova.virt.hardware [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.908 187161 DEBUG nova.virt.hardware [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.908 187161 DEBUG nova.virt.hardware [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.909 187161 DEBUG nova.virt.hardware [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.909 187161 DEBUG nova.virt.hardware [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.912 187161 DEBUG nova.virt.libvirt.vif [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:11:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-945677260',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-945677260',id=21,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8618acb8fd774a27ac00f4e0f10b934c',ramdisk_id='',reservation_id='r-0o5yx18k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:11:46Z,user_data=None,user_id='6048ff4ab0aa45689a23ca16a6558b9d',uuid=c3f975c5-7a53-467c-b760-375c84eb4469,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "address": "fa:16:3e:af:45:d1", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0a44ca-27", "ovs_interfaceid": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.912 187161 DEBUG nova.network.os_vif_util [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Converting VIF {"id": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "address": "fa:16:3e:af:45:d1", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0a44ca-27", "ovs_interfaceid": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.913 187161 DEBUG nova.network.os_vif_util [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:45:d1,bridge_name='br-int',has_traffic_filtering=True,id=4b0a44ca-2766-4f93-94c8-d849e3b3a6ed,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b0a44ca-27') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:11:54 compute-1 nova_compute[187157]: 2025-12-03 00:11:54.913 187161 DEBUG nova.objects.instance [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lazy-loading 'pci_devices' on Instance uuid c3f975c5-7a53-467c-b760-375c84eb4469 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.421 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:11:55 compute-1 nova_compute[187157]:   <uuid>c3f975c5-7a53-467c-b760-375c84eb4469</uuid>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   <name>instance-00000015</name>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   <memory>131072</memory>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   <vcpu>1</vcpu>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   <metadata>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-945677260</nova:name>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-03 00:11:54</nova:creationTime>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:11:55 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 03 00:11:55 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 03 00:11:55 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 03 00:11:55 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:11:55 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:11:55 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 03 00:11:55 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:11:55 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:11:55 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:11:55 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:11:55 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:11:55 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 03 00:11:55 compute-1 nova_compute[187157]:         <nova:properties>
Dec 03 00:11:55 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:11:55 compute-1 nova_compute[187157]:         </nova:properties>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       </nova:image>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <nova:owner>
Dec 03 00:11:55 compute-1 nova_compute[187157]:         <nova:user uuid="6048ff4ab0aa45689a23ca16a6558b9d">tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013-project-admin</nova:user>
Dec 03 00:11:55 compute-1 nova_compute[187157]:         <nova:project uuid="8618acb8fd774a27ac00f4e0f10b934c">tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013</nova:project>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       </nova:owner>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <nova:ports>
Dec 03 00:11:55 compute-1 nova_compute[187157]:         <nova:port uuid="4b0a44ca-2766-4f93-94c8-d849e3b3a6ed">
Dec 03 00:11:55 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:         </nova:port>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       </nova:ports>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     </nova:instance>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   </metadata>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <system>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <entry name="serial">c3f975c5-7a53-467c-b760-375c84eb4469</entry>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <entry name="uuid">c3f975c5-7a53-467c-b760-375c84eb4469</entry>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     </system>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   </sysinfo>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   <os>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   </os>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   <features>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <acpi/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <apic/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <vmcoreinfo/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   </features>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   </clock>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact">
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <model>Nehalem</model>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   </cpu>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   <devices>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk.config"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:af:45:d1"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <target dev="tap4b0a44ca-27"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     </interface>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/console.log" append="off"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     </serial>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <video>
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     </video>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     </rng>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <controller type="usb" index="0"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:11:55 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 03 00:11:55 compute-1 nova_compute[187157]:     </memballoon>
Dec 03 00:11:55 compute-1 nova_compute[187157]:   </devices>
Dec 03 00:11:55 compute-1 nova_compute[187157]: </domain>
Dec 03 00:11:55 compute-1 nova_compute[187157]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.424 187161 DEBUG nova.compute.manager [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Preparing to wait for external event network-vif-plugged-4b0a44ca-2766-4f93-94c8-d849e3b3a6ed prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.424 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.425 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.425 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.426 187161 DEBUG nova.virt.libvirt.vif [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:11:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-945677260',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-945677260',id=21,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8618acb8fd774a27ac00f4e0f10b934c',ramdisk_id='',reservation_id='r-0o5yx18k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:11:46Z,user_data=None,user_id='6048ff4ab0aa45689a23ca16a6558b9d',uuid=c3f975c5-7a53-467c-b760-375c84eb4469,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "address": "fa:16:3e:af:45:d1", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0a44ca-27", "ovs_interfaceid": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.427 187161 DEBUG nova.network.os_vif_util [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Converting VIF {"id": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "address": "fa:16:3e:af:45:d1", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0a44ca-27", "ovs_interfaceid": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.428 187161 DEBUG nova.network.os_vif_util [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:45:d1,bridge_name='br-int',has_traffic_filtering=True,id=4b0a44ca-2766-4f93-94c8-d849e3b3a6ed,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b0a44ca-27') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.428 187161 DEBUG os_vif [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:45:d1,bridge_name='br-int',has_traffic_filtering=True,id=4b0a44ca-2766-4f93-94c8-d849e3b3a6ed,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b0a44ca-27') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.429 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.430 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.430 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.431 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.432 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '0dd9b1d3-458d-5834-a8ae-d4e8849566cb', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.464 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.467 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.470 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.470 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b0a44ca-27, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.470 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap4b0a44ca-27, col_values=(('qos', UUID('92de6998-58eb-4906-ba7a-0d3f715f6391')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.471 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap4b0a44ca-27, col_values=(('external_ids', {'iface-id': '4b0a44ca-2766-4f93-94c8-d849e3b3a6ed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:af:45:d1', 'vm-uuid': 'c3f975c5-7a53-467c-b760-375c84eb4469'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.471 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:55 compute-1 NetworkManager[55553]: <info>  [1764720715.4730] manager: (tap4b0a44ca-27): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.474 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.476 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.477 187161 INFO os_vif [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:45:d1,bridge_name='br-int',has_traffic_filtering=True,id=4b0a44ca-2766-4f93-94c8-d849e3b3a6ed,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b0a44ca-27')
Dec 03 00:11:55 compute-1 nova_compute[187157]: 2025-12-03 00:11:55.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:11:57 compute-1 nova_compute[187157]: 2025-12-03 00:11:57.011 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:11:57 compute-1 nova_compute[187157]: 2025-12-03 00:11:57.011 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:11:57 compute-1 nova_compute[187157]: 2025-12-03 00:11:57.012 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] No VIF found with MAC fa:16:3e:af:45:d1, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:11:57 compute-1 nova_compute[187157]: 2025-12-03 00:11:57.012 187161 INFO nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Using config drive
Dec 03 00:11:57 compute-1 podman[216483]: 2025-12-03 00:11:57.199297211 +0000 UTC m=+0.044733353 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:11:57 compute-1 podman[216484]: 2025-12-03 00:11:57.229385022 +0000 UTC m=+0.072746525 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 03 00:11:57 compute-1 nova_compute[187157]: 2025-12-03 00:11:57.521 187161 WARNING neutronclient.v2_0.client [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:11:57 compute-1 nova_compute[187157]: 2025-12-03 00:11:57.726 187161 INFO nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Creating config drive at /var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk.config
Dec 03 00:11:57 compute-1 nova_compute[187157]: 2025-12-03 00:11:57.730 187161 DEBUG oslo_concurrency.processutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpfabxmjte execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:11:57 compute-1 nova_compute[187157]: 2025-12-03 00:11:57.857 187161 DEBUG oslo_concurrency.processutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpfabxmjte" returned: 0 in 0.127s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:11:57 compute-1 kernel: tap4b0a44ca-27: entered promiscuous mode
Dec 03 00:11:57 compute-1 ovn_controller[95464]: 2025-12-03T00:11:57Z|00189|binding|INFO|Claiming lport 4b0a44ca-2766-4f93-94c8-d849e3b3a6ed for this chassis.
Dec 03 00:11:57 compute-1 ovn_controller[95464]: 2025-12-03T00:11:57Z|00190|binding|INFO|4b0a44ca-2766-4f93-94c8-d849e3b3a6ed: Claiming fa:16:3e:af:45:d1 10.100.0.4
Dec 03 00:11:57 compute-1 nova_compute[187157]: 2025-12-03 00:11:57.921 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:57 compute-1 NetworkManager[55553]: <info>  [1764720717.9221] manager: (tap4b0a44ca-27): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Dec 03 00:11:57 compute-1 nova_compute[187157]: 2025-12-03 00:11:57.924 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:57 compute-1 nova_compute[187157]: 2025-12-03 00:11:57.927 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:57.936 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:45:d1 10.100.0.4'], port_security=['fa:16:3e:af:45:d1 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c3f975c5-7a53-467c-b760-375c84eb4469', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44651134-dca8-45c2-963a-1f17aac67593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8618acb8fd774a27ac00f4e0f10b934c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1aec6d5f-c8c6-4b74-ad3d-5af55712b2e3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8f4f3fe-a4b0-48ff-9b01-d63a8cee7576, chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=4b0a44ca-2766-4f93-94c8-d849e3b3a6ed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:11:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:57.937 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 4b0a44ca-2766-4f93-94c8-d849e3b3a6ed in datapath 44651134-dca8-45c2-963a-1f17aac67593 bound to our chassis
Dec 03 00:11:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:57.938 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44651134-dca8-45c2-963a-1f17aac67593
Dec 03 00:11:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:57.949 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[2b321122-5f5e-4fa4-a0e8-a0d00f5dd5d9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:57.950 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap44651134-d1 in ovnmeta-44651134-dca8-45c2-963a-1f17aac67593 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:11:57 compute-1 systemd-machined[153454]: New machine qemu-17-instance-00000015.
Dec 03 00:11:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:57.952 207957 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap44651134-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:11:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:57.952 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[45decc20-fb6e-4c1f-a2f6-88be2b5d693e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:57 compute-1 systemd-udevd[216546]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:11:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:57.953 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d0ba89f5-1442-423f-b017-0fb52e3bb0df]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:57.965 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[4741d08b-f428-4bcb-9dcb-2774221ff5d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:57 compute-1 NetworkManager[55553]: <info>  [1764720717.9662] device (tap4b0a44ca-27): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:11:57 compute-1 NetworkManager[55553]: <info>  [1764720717.9675] device (tap4b0a44ca-27): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:11:57 compute-1 systemd[1]: Started Virtual Machine qemu-17-instance-00000015.
Dec 03 00:11:57 compute-1 nova_compute[187157]: 2025-12-03 00:11:57.980 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:57 compute-1 ovn_controller[95464]: 2025-12-03T00:11:57Z|00191|binding|INFO|Setting lport 4b0a44ca-2766-4f93-94c8-d849e3b3a6ed ovn-installed in OVS
Dec 03 00:11:57 compute-1 ovn_controller[95464]: 2025-12-03T00:11:57Z|00192|binding|INFO|Setting lport 4b0a44ca-2766-4f93-94c8-d849e3b3a6ed up in Southbound
Dec 03 00:11:57 compute-1 nova_compute[187157]: 2025-12-03 00:11:57.985 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:57 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:57.988 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[5998caf9-0d76-4a98-9b68-960f68820362]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:58 compute-1 nova_compute[187157]: 2025-12-03 00:11:58.009 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.013 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[5a0393f2-f5c6-45c6-97cc-1f4700f68f1e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.018 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[defea035-3817-4fa6-b7e3-4506b117748d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:58 compute-1 systemd-udevd[216550]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:11:58 compute-1 NetworkManager[55553]: <info>  [1764720718.0197] manager: (tap44651134-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.048 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[61954cff-557b-40b9-9267-f25bd6069d03]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.051 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[40e4c166-9ea1-4435-a5bb-6ed706292e07]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:58 compute-1 NetworkManager[55553]: <info>  [1764720718.0703] device (tap44651134-d0): carrier: link connected
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.075 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[c3ea756d-08e0-43d8-847a-105c681bf590]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.089 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e3534f-9b05-4bf0-8389-4d80abde7a3e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44651134-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:a0:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473772, 'reachable_time': 31600, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216578, 'error': None, 'target': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.103 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[951eb8af-a038-47fc-ab5f-b1ffa129e05d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe63:a0ba'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473772, 'tstamp': 473772}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216579, 'error': None, 'target': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.118 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[42a5f507-72c6-462e-8443-aeff95efc48c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44651134-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:a0:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473772, 'reachable_time': 31600, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216580, 'error': None, 'target': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.150 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ece4ae05-0ea4-4e3b-bc66-8a00b46f7c0a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.203 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[6a77a1a0-2ab5-4a35-87d4-77d6f5d9983c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.205 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44651134-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.205 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:11:58 compute-1 nova_compute[187157]: 2025-12-03 00:11:58.207 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.205 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44651134-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:58 compute-1 NetworkManager[55553]: <info>  [1764720718.2096] manager: (tap44651134-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Dec 03 00:11:58 compute-1 kernel: tap44651134-d0: entered promiscuous mode
Dec 03 00:11:58 compute-1 nova_compute[187157]: 2025-12-03 00:11:58.211 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.212 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44651134-d0, col_values=(('external_ids', {'iface-id': 'c9c5ad1f-d82d-4b26-aa35-4c2bd8e4a10c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:58 compute-1 nova_compute[187157]: 2025-12-03 00:11:58.213 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:58 compute-1 nova_compute[187157]: 2025-12-03 00:11:58.218 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.220 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[4af2fc0f-8ebf-472d-9546-57111ae8b3a5]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.221 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.221 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.221 104348 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 44651134-dca8-45c2-963a-1f17aac67593 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.221 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.222 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[0716ec62-f3b5-4aa4-88b2-f9b8f18c5ed6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.222 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.223 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[0ea516b8-8839-4f19-aaa1-30f2ad2a6f0a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.223 104348 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: global
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     log         /dev/log local0 debug
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     log-tag     haproxy-metadata-proxy-44651134-dca8-45c2-963a-1f17aac67593
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     user        root
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     group       root
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     maxconn     1024
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     pidfile     /var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     daemon
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: defaults
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     log global
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     mode http
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     option httplog
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     option dontlognull
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     option http-server-close
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     option forwardfor
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     retries                 3
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     timeout http-request    30s
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     timeout connect         30s
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     timeout client          32s
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     timeout server          32s
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     timeout http-keep-alive 30s
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: listen listener
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     bind 169.254.169.254:80
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:     http-request add-header X-OVN-Network-ID 44651134-dca8-45c2-963a-1f17aac67593
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.224 104348 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'env', 'PROCESS_TAG=haproxy-44651134-dca8-45c2-963a-1f17aac67593', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/44651134-dca8-45c2-963a-1f17aac67593.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:11:58 compute-1 ovn_controller[95464]: 2025-12-03T00:11:58Z|00193|binding|INFO|Releasing lport c9c5ad1f-d82d-4b26-aa35-4c2bd8e4a10c from this chassis (sb_readonly=0)
Dec 03 00:11:58 compute-1 nova_compute[187157]: 2025-12-03 00:11:58.243 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:58 compute-1 podman[216611]: 2025-12-03 00:11:58.591330865 +0000 UTC m=+0.047074120 container create 8fc7686eddf9f48cf047d92ad0841a5d9770ec9aa41d0d101607f19444b80f57 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 03 00:11:58 compute-1 systemd[1]: Started libpod-conmon-8fc7686eddf9f48cf047d92ad0841a5d9770ec9aa41d0d101607f19444b80f57.scope.
Dec 03 00:11:58 compute-1 podman[216611]: 2025-12-03 00:11:58.567300199 +0000 UTC m=+0.023043474 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:11:58 compute-1 systemd[1]: Started libcrun container.
Dec 03 00:11:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c541ca7a4c0f119edbbc24ad63b6ac0789bc085b8386caf09bb68d2572afee7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:11:58 compute-1 podman[216611]: 2025-12-03 00:11:58.686580298 +0000 UTC m=+0.142323603 container init 8fc7686eddf9f48cf047d92ad0841a5d9770ec9aa41d0d101607f19444b80f57 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Dec 03 00:11:58 compute-1 podman[216611]: 2025-12-03 00:11:58.691940957 +0000 UTC m=+0.147684232 container start 8fc7686eddf9f48cf047d92ad0841a5d9770ec9aa41d0d101607f19444b80f57 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 03 00:11:58 compute-1 neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593[216626]: [NOTICE]   (216630) : New worker (216632) forked
Dec 03 00:11:58 compute-1 neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593[216626]: [NOTICE]   (216630) : Loading success.
Dec 03 00:11:58 compute-1 nova_compute[187157]: 2025-12-03 00:11:58.762 187161 DEBUG nova.compute.manager [req-dccdfa90-5ba9-45f4-a175-c3a547a4bab3 req-db6528f1-beab-4106-be47-3342db8f9236 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Received event network-vif-plugged-4b0a44ca-2766-4f93-94c8-d849e3b3a6ed external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:11:58 compute-1 nova_compute[187157]: 2025-12-03 00:11:58.762 187161 DEBUG oslo_concurrency.lockutils [req-dccdfa90-5ba9-45f4-a175-c3a547a4bab3 req-db6528f1-beab-4106-be47-3342db8f9236 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:58 compute-1 nova_compute[187157]: 2025-12-03 00:11:58.763 187161 DEBUG oslo_concurrency.lockutils [req-dccdfa90-5ba9-45f4-a175-c3a547a4bab3 req-db6528f1-beab-4106-be47-3342db8f9236 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:58 compute-1 nova_compute[187157]: 2025-12-03 00:11:58.763 187161 DEBUG oslo_concurrency.lockutils [req-dccdfa90-5ba9-45f4-a175-c3a547a4bab3 req-db6528f1-beab-4106-be47-3342db8f9236 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:58 compute-1 nova_compute[187157]: 2025-12-03 00:11:58.763 187161 DEBUG nova.compute.manager [req-dccdfa90-5ba9-45f4-a175-c3a547a4bab3 req-db6528f1-beab-4106-be47-3342db8f9236 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Processing event network-vif-plugged-4b0a44ca-2766-4f93-94c8-d849e3b3a6ed _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.821 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.822 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:11:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:11:58.822 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:58 compute-1 nova_compute[187157]: 2025-12-03 00:11:58.849 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:58 compute-1 nova_compute[187157]: 2025-12-03 00:11:58.881 187161 DEBUG nova.compute.manager [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:11:58 compute-1 nova_compute[187157]: 2025-12-03 00:11:58.883 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:11:58 compute-1 nova_compute[187157]: 2025-12-03 00:11:58.886 187161 INFO nova.virt.libvirt.driver [-] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Instance spawned successfully.
Dec 03 00:11:58 compute-1 nova_compute[187157]: 2025-12-03 00:11:58.886 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:11:59 compute-1 nova_compute[187157]: 2025-12-03 00:11:59.397 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:11:59 compute-1 nova_compute[187157]: 2025-12-03 00:11:59.398 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:11:59 compute-1 nova_compute[187157]: 2025-12-03 00:11:59.398 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:11:59 compute-1 nova_compute[187157]: 2025-12-03 00:11:59.399 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:11:59 compute-1 nova_compute[187157]: 2025-12-03 00:11:59.399 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:11:59 compute-1 nova_compute[187157]: 2025-12-03 00:11:59.400 187161 DEBUG nova.virt.libvirt.driver [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:11:59 compute-1 nova_compute[187157]: 2025-12-03 00:11:59.909 187161 INFO nova.compute.manager [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Took 11.27 seconds to spawn the instance on the hypervisor.
Dec 03 00:11:59 compute-1 nova_compute[187157]: 2025-12-03 00:11:59.910 187161 DEBUG nova.compute.manager [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:12:00 compute-1 nova_compute[187157]: 2025-12-03 00:12:00.442 187161 INFO nova.compute.manager [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Took 18.17 seconds to build instance.
Dec 03 00:12:00 compute-1 nova_compute[187157]: 2025-12-03 00:12:00.473 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:00 compute-1 nova_compute[187157]: 2025-12-03 00:12:00.815 187161 DEBUG nova.compute.manager [req-7adb7040-97e5-4f4b-8922-6b95de35b3b9 req-e7053f6f-6523-45e0-afa6-1d60cf4d9a2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Received event network-vif-plugged-4b0a44ca-2766-4f93-94c8-d849e3b3a6ed external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:12:00 compute-1 nova_compute[187157]: 2025-12-03 00:12:00.816 187161 DEBUG oslo_concurrency.lockutils [req-7adb7040-97e5-4f4b-8922-6b95de35b3b9 req-e7053f6f-6523-45e0-afa6-1d60cf4d9a2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:00 compute-1 nova_compute[187157]: 2025-12-03 00:12:00.816 187161 DEBUG oslo_concurrency.lockutils [req-7adb7040-97e5-4f4b-8922-6b95de35b3b9 req-e7053f6f-6523-45e0-afa6-1d60cf4d9a2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:00 compute-1 nova_compute[187157]: 2025-12-03 00:12:00.816 187161 DEBUG oslo_concurrency.lockutils [req-7adb7040-97e5-4f4b-8922-6b95de35b3b9 req-e7053f6f-6523-45e0-afa6-1d60cf4d9a2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:00 compute-1 nova_compute[187157]: 2025-12-03 00:12:00.817 187161 DEBUG nova.compute.manager [req-7adb7040-97e5-4f4b-8922-6b95de35b3b9 req-e7053f6f-6523-45e0-afa6-1d60cf4d9a2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] No waiting events found dispatching network-vif-plugged-4b0a44ca-2766-4f93-94c8-d849e3b3a6ed pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:12:00 compute-1 nova_compute[187157]: 2025-12-03 00:12:00.817 187161 WARNING nova.compute.manager [req-7adb7040-97e5-4f4b-8922-6b95de35b3b9 req-e7053f6f-6523-45e0-afa6-1d60cf4d9a2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Received unexpected event network-vif-plugged-4b0a44ca-2766-4f93-94c8-d849e3b3a6ed for instance with vm_state active and task_state None.
Dec 03 00:12:00 compute-1 nova_compute[187157]: 2025-12-03 00:12:00.947 187161 DEBUG oslo_concurrency.lockutils [None req-a80f5d40-e383-4da2-a18e-70554a6ff7bc 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "c3f975c5-7a53-467c-b760-375c84eb4469" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.688s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:01.732 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:01.733 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:01.734 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:03 compute-1 nova_compute[187157]: 2025-12-03 00:12:03.011 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:05 compute-1 nova_compute[187157]: 2025-12-03 00:12:05.478 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:05 compute-1 podman[197537]: time="2025-12-03T00:12:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:12:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:12:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:12:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:12:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3077 "" "Go-http-client/1.1"
Dec 03 00:12:08 compute-1 nova_compute[187157]: 2025-12-03 00:12:08.013 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:09 compute-1 podman[216651]: 2025-12-03 00:12:09.248354572 +0000 UTC m=+0.086409101 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, maintainer=Red Hat, Inc., version=9.6, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Dec 03 00:12:10 compute-1 nova_compute[187157]: 2025-12-03 00:12:10.481 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:12 compute-1 ovn_controller[95464]: 2025-12-03T00:12:12Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:af:45:d1 10.100.0.4
Dec 03 00:12:12 compute-1 ovn_controller[95464]: 2025-12-03T00:12:12Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:af:45:d1 10.100.0.4
Dec 03 00:12:13 compute-1 nova_compute[187157]: 2025-12-03 00:12:13.049 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:13 compute-1 podman[216690]: 2025-12-03 00:12:13.216444055 +0000 UTC m=+0.057403210 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Dec 03 00:12:15 compute-1 nova_compute[187157]: 2025-12-03 00:12:15.484 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:18 compute-1 nova_compute[187157]: 2025-12-03 00:12:18.051 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:19 compute-1 openstack_network_exporter[199685]: ERROR   00:12:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:12:19 compute-1 openstack_network_exporter[199685]: ERROR   00:12:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:12:19 compute-1 openstack_network_exporter[199685]: ERROR   00:12:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:12:19 compute-1 openstack_network_exporter[199685]: ERROR   00:12:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:12:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:12:19 compute-1 openstack_network_exporter[199685]: ERROR   00:12:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:12:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:12:20 compute-1 nova_compute[187157]: 2025-12-03 00:12:20.486 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:22 compute-1 podman[216711]: 2025-12-03 00:12:22.207582878 +0000 UTC m=+0.055354731 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:12:23 compute-1 nova_compute[187157]: 2025-12-03 00:12:23.053 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:25 compute-1 nova_compute[187157]: 2025-12-03 00:12:25.490 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:27 compute-1 nova_compute[187157]: 2025-12-03 00:12:27.372 187161 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Creating tmpfile /var/lib/nova/instances/tmphtz5c8zp to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 03 00:12:27 compute-1 nova_compute[187157]: 2025-12-03 00:12:27.373 187161 WARNING neutronclient.v2_0.client [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:27 compute-1 nova_compute[187157]: 2025-12-03 00:12:27.384 187161 DEBUG nova.compute.manager [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphtz5c8zp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 03 00:12:27 compute-1 podman[216736]: 2025-12-03 00:12:27.454731702 +0000 UTC m=+0.052378877 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 03 00:12:27 compute-1 podman[216738]: 2025-12-03 00:12:27.480079196 +0000 UTC m=+0.075751373 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:12:28 compute-1 nova_compute[187157]: 2025-12-03 00:12:28.097 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:28 compute-1 ovn_controller[95464]: 2025-12-03T00:12:28Z|00194|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 03 00:12:29 compute-1 nova_compute[187157]: 2025-12-03 00:12:29.419 187161 WARNING neutronclient.v2_0.client [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:30 compute-1 nova_compute[187157]: 2025-12-03 00:12:30.494 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:33 compute-1 nova_compute[187157]: 2025-12-03 00:12:33.100 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:33 compute-1 nova_compute[187157]: 2025-12-03 00:12:33.764 187161 DEBUG nova.compute.manager [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphtz5c8zp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='eb1b85fe-471a-46bd-9929-c377144cb8eb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 03 00:12:34 compute-1 nova_compute[187157]: 2025-12-03 00:12:34.785 187161 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-eb1b85fe-471a-46bd-9929-c377144cb8eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:12:34 compute-1 nova_compute[187157]: 2025-12-03 00:12:34.786 187161 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-eb1b85fe-471a-46bd-9929-c377144cb8eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:12:34 compute-1 nova_compute[187157]: 2025-12-03 00:12:34.786 187161 DEBUG nova.network.neutron [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:12:35 compute-1 nova_compute[187157]: 2025-12-03 00:12:35.292 187161 WARNING neutronclient.v2_0.client [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:35 compute-1 nova_compute[187157]: 2025-12-03 00:12:35.497 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:35 compute-1 podman[197537]: time="2025-12-03T00:12:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:12:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:12:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:12:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:12:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3069 "" "Go-http-client/1.1"
Dec 03 00:12:36 compute-1 nova_compute[187157]: 2025-12-03 00:12:36.734 187161 WARNING neutronclient.v2_0.client [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:36 compute-1 nova_compute[187157]: 2025-12-03 00:12:36.943 187161 DEBUG nova.network.neutron [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Updating instance_info_cache with network_info: [{"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:12:37 compute-1 nova_compute[187157]: 2025-12-03 00:12:37.449 187161 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-eb1b85fe-471a-46bd-9929-c377144cb8eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:12:37 compute-1 nova_compute[187157]: 2025-12-03 00:12:37.461 187161 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphtz5c8zp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='eb1b85fe-471a-46bd-9929-c377144cb8eb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 03 00:12:37 compute-1 nova_compute[187157]: 2025-12-03 00:12:37.462 187161 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Creating instance directory: /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 03 00:12:37 compute-1 nova_compute[187157]: 2025-12-03 00:12:37.462 187161 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Creating disk.info with the contents: {'/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk': 'qcow2', '/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 03 00:12:37 compute-1 nova_compute[187157]: 2025-12-03 00:12:37.463 187161 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 03 00:12:37 compute-1 nova_compute[187157]: 2025-12-03 00:12:37.463 187161 DEBUG nova.objects.instance [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'trusted_certs' on Instance uuid eb1b85fe-471a-46bd-9929-c377144cb8eb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:12:37 compute-1 nova_compute[187157]: 2025-12-03 00:12:37.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:37 compute-1 nova_compute[187157]: 2025-12-03 00:12:37.969 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:12:37 compute-1 nova_compute[187157]: 2025-12-03 00:12:37.972 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:12:37 compute-1 nova_compute[187157]: 2025-12-03 00:12:37.974 187161 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.024 187161 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.025 187161 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.025 187161 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.026 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.029 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.029 187161 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.088 187161 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.088 187161 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.119 187161 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.120 187161 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.095s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.121 187161 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.151 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.172 187161 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.173 187161 DEBUG nova.virt.disk.api [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.173 187161 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.222 187161 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.223 187161 DEBUG nova.virt.disk.api [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.224 187161 DEBUG nova.objects.instance [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid eb1b85fe-471a-46bd-9929-c377144cb8eb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.736 187161 DEBUG nova.objects.base [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<eb1b85fe-471a-46bd-9929-c377144cb8eb> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.736 187161 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.757 187161 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk.config 497664" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.757 187161 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.759 187161 DEBUG nova.virt.libvirt.vif [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:11:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1319059932',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1319059932',id=20,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:11:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8618acb8fd774a27ac00f4e0f10b934c',ramdisk_id='',reservation_id='r-0nyztijh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:11:34Z,user_data=None,user_id='6048ff4ab0aa45689a23ca16a6558b9d',uuid=eb1b85fe-471a-46bd-9929-c377144cb8eb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.759 187161 DEBUG nova.network.os_vif_util [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.760 187161 DEBUG nova.network.os_vif_util [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f1:9d,bridge_name='br-int',has_traffic_filtering=True,id=f36e34c0-cc70-4a73-b904-d40c504fefa3,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf36e34c0-cc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.760 187161 DEBUG os_vif [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f1:9d,bridge_name='br-int',has_traffic_filtering=True,id=f36e34c0-cc70-4a73-b904-d40c504fefa3,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf36e34c0-cc') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.761 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.761 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.762 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.762 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.763 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '72379746-54a5-5b3c-8e2b-3b5fd7f8917b', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.764 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.765 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.767 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.767 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf36e34c0-cc, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.767 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapf36e34c0-cc, col_values=(('qos', UUID('0a16c77c-4bbd-4e82-a2cb-fe824cd7efa3')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.767 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapf36e34c0-cc, col_values=(('external_ids', {'iface-id': 'f36e34c0-cc70-4a73-b904-d40c504fefa3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:f1:9d', 'vm-uuid': 'eb1b85fe-471a-46bd-9929-c377144cb8eb'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.768 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:38 compute-1 NetworkManager[55553]: <info>  [1764720758.7693] manager: (tapf36e34c0-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.770 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.773 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.774 187161 INFO os_vif [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f1:9d,bridge_name='br-int',has_traffic_filtering=True,id=f36e34c0-cc70-4a73-b904-d40c504fefa3,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf36e34c0-cc')
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.774 187161 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.774 187161 DEBUG nova.compute.manager [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphtz5c8zp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='eb1b85fe-471a-46bd-9929-c377144cb8eb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.775 187161 WARNING neutronclient.v2_0.client [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:38 compute-1 nova_compute[187157]: 2025-12-03 00:12:38.859 187161 WARNING neutronclient.v2_0.client [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:39 compute-1 nova_compute[187157]: 2025-12-03 00:12:39.129 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:39 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:39.129 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:12:39 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:39.130 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:12:39 compute-1 nova_compute[187157]: 2025-12-03 00:12:39.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:40 compute-1 nova_compute[187157]: 2025-12-03 00:12:40.031 187161 DEBUG nova.network.neutron [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Port f36e34c0-cc70-4a73-b904-d40c504fefa3 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 03 00:12:40 compute-1 nova_compute[187157]: 2025-12-03 00:12:40.044 187161 DEBUG nova.compute.manager [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphtz5c8zp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='eb1b85fe-471a-46bd-9929-c377144cb8eb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 03 00:12:40 compute-1 podman[216800]: 2025-12-03 00:12:40.213252688 +0000 UTC m=+0.053199618 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, config_id=edpm, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 03 00:12:42 compute-1 nova_compute[187157]: 2025-12-03 00:12:42.208 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:42 compute-1 nova_compute[187157]: 2025-12-03 00:12:42.208 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 03 00:12:42 compute-1 systemd[1]: Starting libvirt proxy daemon...
Dec 03 00:12:42 compute-1 systemd[1]: Started libvirt proxy daemon.
Dec 03 00:12:42 compute-1 kernel: tapf36e34c0-cc: entered promiscuous mode
Dec 03 00:12:42 compute-1 NetworkManager[55553]: <info>  [1764720762.9090] manager: (tapf36e34c0-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Dec 03 00:12:42 compute-1 ovn_controller[95464]: 2025-12-03T00:12:42Z|00195|binding|INFO|Claiming lport f36e34c0-cc70-4a73-b904-d40c504fefa3 for this additional chassis.
Dec 03 00:12:42 compute-1 ovn_controller[95464]: 2025-12-03T00:12:42Z|00196|binding|INFO|f36e34c0-cc70-4a73-b904-d40c504fefa3: Claiming fa:16:3e:ee:f1:9d 10.100.0.13
Dec 03 00:12:42 compute-1 nova_compute[187157]: 2025-12-03 00:12:42.909 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:42 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:42.919 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:f1:9d 10.100.0.13'], port_security=['fa:16:3e:ee:f1:9d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'eb1b85fe-471a-46bd-9929-c377144cb8eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44651134-dca8-45c2-963a-1f17aac67593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8618acb8fd774a27ac00f4e0f10b934c', 'neutron:revision_number': '10', 'neutron:security_group_ids': '1aec6d5f-c8c6-4b74-ad3d-5af55712b2e3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8f4f3fe-a4b0-48ff-9b01-d63a8cee7576, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=f36e34c0-cc70-4a73-b904-d40c504fefa3) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:12:42 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:42.920 104348 INFO neutron.agent.ovn.metadata.agent [-] Port f36e34c0-cc70-4a73-b904-d40c504fefa3 in datapath 44651134-dca8-45c2-963a-1f17aac67593 unbound from our chassis
Dec 03 00:12:42 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:42.921 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44651134-dca8-45c2-963a-1f17aac67593
Dec 03 00:12:42 compute-1 nova_compute[187157]: 2025-12-03 00:12:42.923 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:42 compute-1 ovn_controller[95464]: 2025-12-03T00:12:42Z|00197|binding|INFO|Setting lport f36e34c0-cc70-4a73-b904-d40c504fefa3 ovn-installed in OVS
Dec 03 00:12:42 compute-1 nova_compute[187157]: 2025-12-03 00:12:42.926 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:42 compute-1 nova_compute[187157]: 2025-12-03 00:12:42.928 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:42 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:42.936 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[573084e8-8c3a-4197-a27b-ec321ed93e67]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:12:42 compute-1 systemd-udevd[216858]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:12:42 compute-1 systemd-machined[153454]: New machine qemu-18-instance-00000014.
Dec 03 00:12:42 compute-1 NetworkManager[55553]: <info>  [1764720762.9551] device (tapf36e34c0-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:12:42 compute-1 NetworkManager[55553]: <info>  [1764720762.9564] device (tapf36e34c0-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:12:42 compute-1 systemd[1]: Started Virtual Machine qemu-18-instance-00000014.
Dec 03 00:12:42 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:42.969 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[85300155-9966-41ac-9ecb-95e01ee93b97]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:12:42 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:42.972 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[163b70c1-7c5a-4c16-b2e2-f6b0697a52c5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:12:43 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:43.007 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[0e767f35-7f4e-4a86-a07b-74856535fca2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:12:43 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:43.024 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b31c8bc5-931a-4bd9-9036-77684f2c4d6f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44651134-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:a0:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473772, 'reachable_time': 31600, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216871, 'error': None, 'target': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:12:43 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:43.039 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[bc8c39b2-6fdc-4086-8fe1-3b5fde2149eb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap44651134-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473782, 'tstamp': 473782}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216872, 'error': None, 'target': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap44651134-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473785, 'tstamp': 473785}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216872, 'error': None, 'target': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:12:43 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:43.041 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44651134-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:12:43 compute-1 nova_compute[187157]: 2025-12-03 00:12:43.042 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:43 compute-1 nova_compute[187157]: 2025-12-03 00:12:43.043 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:43 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:43.043 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44651134-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:12:43 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:43.044 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:12:43 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:43.044 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44651134-d0, col_values=(('external_ids', {'iface-id': 'c9c5ad1f-d82d-4b26-aa35-4c2bd8e4a10c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:12:43 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:43.044 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:12:43 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:43.045 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[93bd6858-e48a-4d6e-a252-40724be8e0e0]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-44651134-dca8-45c2-963a-1f17aac67593\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 44651134-dca8-45c2-963a-1f17aac67593\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:12:43 compute-1 nova_compute[187157]: 2025-12-03 00:12:43.152 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:43 compute-1 nova_compute[187157]: 2025-12-03 00:12:43.768 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:44 compute-1 nova_compute[187157]: 2025-12-03 00:12:44.209 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:44 compute-1 nova_compute[187157]: 2025-12-03 00:12:44.209 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:44 compute-1 podman[216896]: 2025-12-03 00:12:44.226319364 +0000 UTC m=+0.062033055 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:12:46 compute-1 nova_compute[187157]: 2025-12-03 00:12:46.210 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:46 compute-1 ovn_controller[95464]: 2025-12-03T00:12:46Z|00198|binding|INFO|Claiming lport f36e34c0-cc70-4a73-b904-d40c504fefa3 for this chassis.
Dec 03 00:12:46 compute-1 ovn_controller[95464]: 2025-12-03T00:12:46Z|00199|binding|INFO|f36e34c0-cc70-4a73-b904-d40c504fefa3: Claiming fa:16:3e:ee:f1:9d 10.100.0.13
Dec 03 00:12:46 compute-1 ovn_controller[95464]: 2025-12-03T00:12:46Z|00200|binding|INFO|Setting lport f36e34c0-cc70-4a73-b904-d40c504fefa3 up in Southbound
Dec 03 00:12:47 compute-1 nova_compute[187157]: 2025-12-03 00:12:47.000 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:47 compute-1 nova_compute[187157]: 2025-12-03 00:12:47.002 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:47 compute-1 nova_compute[187157]: 2025-12-03 00:12:47.002 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:47 compute-1 nova_compute[187157]: 2025-12-03 00:12:47.002 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:12:47 compute-1 nova_compute[187157]: 2025-12-03 00:12:47.712 187161 INFO nova.compute.manager [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Post operation of migration started
Dec 03 00:12:47 compute-1 nova_compute[187157]: 2025-12-03 00:12:47.713 187161 WARNING neutronclient.v2_0.client [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:47 compute-1 nova_compute[187157]: 2025-12-03 00:12:47.797 187161 WARNING neutronclient.v2_0.client [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:47 compute-1 nova_compute[187157]: 2025-12-03 00:12:47.798 187161 WARNING neutronclient.v2_0.client [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:47 compute-1 nova_compute[187157]: 2025-12-03 00:12:47.885 187161 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-eb1b85fe-471a-46bd-9929-c377144cb8eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:12:47 compute-1 nova_compute[187157]: 2025-12-03 00:12:47.886 187161 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-eb1b85fe-471a-46bd-9929-c377144cb8eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:12:47 compute-1 nova_compute[187157]: 2025-12-03 00:12:47.886 187161 DEBUG nova.network.neutron [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.054 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.111 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.112 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.152 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.182 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.186 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.234 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.235 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.282 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.393 187161 WARNING neutronclient.v2_0.client [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.413 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.414 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.430 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.431 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5516MB free_disk=73.10822677612305GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.431 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.431 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.810 187161 WARNING neutronclient.v2_0.client [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.812 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:48 compute-1 sshd-session[216916]: Invalid user admin from 185.156.73.233 port 23696
Dec 03 00:12:48 compute-1 nova_compute[187157]: 2025-12-03 00:12:48.972 187161 DEBUG nova.network.neutron [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Updating instance_info_cache with network_info: [{"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:12:49 compute-1 sshd-session[216916]: Connection closed by invalid user admin 185.156.73.233 port 23696 [preauth]
Dec 03 00:12:49 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:12:49.131 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:12:49 compute-1 openstack_network_exporter[199685]: ERROR   00:12:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:12:49 compute-1 openstack_network_exporter[199685]: ERROR   00:12:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:12:49 compute-1 openstack_network_exporter[199685]: ERROR   00:12:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:12:49 compute-1 openstack_network_exporter[199685]: ERROR   00:12:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:12:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:12:49 compute-1 openstack_network_exporter[199685]: ERROR   00:12:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:12:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:12:49 compute-1 nova_compute[187157]: 2025-12-03 00:12:49.449 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Migration for instance eb1b85fe-471a-46bd-9929-c377144cb8eb refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:12:49 compute-1 nova_compute[187157]: 2025-12-03 00:12:49.479 187161 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-eb1b85fe-471a-46bd-9929-c377144cb8eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:12:49 compute-1 nova_compute[187157]: 2025-12-03 00:12:49.955 187161 INFO nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Updating resource usage from migration 8e05e9e2-afa5-4a3a-a936-f7c5663fe52f
Dec 03 00:12:49 compute-1 nova_compute[187157]: 2025-12-03 00:12:49.955 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Starting to track incoming migration 8e05e9e2-afa5-4a3a-a936-f7c5663fe52f with flavor b2669e62-ef04-4b34-b3d6-69efcfbafbdc _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 03 00:12:49 compute-1 nova_compute[187157]: 2025-12-03 00:12:49.999 187161 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:50 compute-1 nova_compute[187157]: 2025-12-03 00:12:50.489 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance c3f975c5-7a53-467c-b760-375c84eb4469 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:12:50 compute-1 nova_compute[187157]: 2025-12-03 00:12:50.996 187161 WARNING nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance eb1b85fe-471a-46bd-9929-c377144cb8eb has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Dec 03 00:12:50 compute-1 nova_compute[187157]: 2025-12-03 00:12:50.996 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:12:50 compute-1 nova_compute[187157]: 2025-12-03 00:12:50.996 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:12:48 up  1:19,  0 user,  load average: 0.32, 0.35, 0.36\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_8618acb8fd774a27ac00f4e0f10b934c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:12:51 compute-1 nova_compute[187157]: 2025-12-03 00:12:51.041 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:12:51 compute-1 nova_compute[187157]: 2025-12-03 00:12:51.548 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:12:52 compute-1 nova_compute[187157]: 2025-12-03 00:12:52.061 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:12:52 compute-1 nova_compute[187157]: 2025-12-03 00:12:52.062 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.630s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:52 compute-1 nova_compute[187157]: 2025-12-03 00:12:52.062 187161 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 2.063s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:52 compute-1 nova_compute[187157]: 2025-12-03 00:12:52.062 187161 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:52 compute-1 nova_compute[187157]: 2025-12-03 00:12:52.066 187161 INFO nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 03 00:12:52 compute-1 virtqemud[186882]: Domain id=18 name='instance-00000014' uuid=eb1b85fe-471a-46bd-9929-c377144cb8eb is tainted: custom-monitor
Dec 03 00:12:53 compute-1 nova_compute[187157]: 2025-12-03 00:12:53.073 187161 INFO nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 03 00:12:53 compute-1 nova_compute[187157]: 2025-12-03 00:12:53.155 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:53 compute-1 podman[216932]: 2025-12-03 00:12:53.215287873 +0000 UTC m=+0.052443019 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:12:53 compute-1 nova_compute[187157]: 2025-12-03 00:12:53.552 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:53 compute-1 nova_compute[187157]: 2025-12-03 00:12:53.552 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:53 compute-1 nova_compute[187157]: 2025-12-03 00:12:53.814 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:54 compute-1 nova_compute[187157]: 2025-12-03 00:12:54.079 187161 INFO nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 03 00:12:54 compute-1 nova_compute[187157]: 2025-12-03 00:12:54.084 187161 DEBUG nova.compute.manager [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:12:54 compute-1 nova_compute[187157]: 2025-12-03 00:12:54.596 187161 DEBUG nova.objects.instance [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 03 00:12:54 compute-1 nova_compute[187157]: 2025-12-03 00:12:54.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:54 compute-1 nova_compute[187157]: 2025-12-03 00:12:54.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 03 00:12:55 compute-1 nova_compute[187157]: 2025-12-03 00:12:55.211 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 03 00:12:55 compute-1 nova_compute[187157]: 2025-12-03 00:12:55.618 187161 WARNING neutronclient.v2_0.client [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:55 compute-1 nova_compute[187157]: 2025-12-03 00:12:55.733 187161 WARNING neutronclient.v2_0.client [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:55 compute-1 nova_compute[187157]: 2025-12-03 00:12:55.733 187161 WARNING neutronclient.v2_0.client [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:56 compute-1 nova_compute[187157]: 2025-12-03 00:12:56.212 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:56 compute-1 nova_compute[187157]: 2025-12-03 00:12:56.212 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:12:56 compute-1 nova_compute[187157]: 2025-12-03 00:12:56.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:56 compute-1 nova_compute[187157]: 2025-12-03 00:12:56.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:58 compute-1 nova_compute[187157]: 2025-12-03 00:12:58.157 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:58 compute-1 podman[216958]: 2025-12-03 00:12:58.214224221 +0000 UTC m=+0.053191627 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 03 00:12:58 compute-1 podman[216959]: 2025-12-03 00:12:58.238084757 +0000 UTC m=+0.073993127 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:12:58 compute-1 nova_compute[187157]: 2025-12-03 00:12:58.815 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:59 compute-1 nova_compute[187157]: 2025-12-03 00:12:59.696 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:59 compute-1 nova_compute[187157]: 2025-12-03 00:12:59.991 187161 DEBUG oslo_concurrency.lockutils [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "c3f975c5-7a53-467c-b760-375c84eb4469" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:59 compute-1 nova_compute[187157]: 2025-12-03 00:12:59.991 187161 DEBUG oslo_concurrency.lockutils [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "c3f975c5-7a53-467c-b760-375c84eb4469" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:59 compute-1 nova_compute[187157]: 2025-12-03 00:12:59.991 187161 DEBUG oslo_concurrency.lockutils [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:59 compute-1 nova_compute[187157]: 2025-12-03 00:12:59.991 187161 DEBUG oslo_concurrency.lockutils [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:59 compute-1 nova_compute[187157]: 2025-12-03 00:12:59.992 187161 DEBUG oslo_concurrency.lockutils [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:00 compute-1 nova_compute[187157]: 2025-12-03 00:13:00.006 187161 INFO nova.compute.manager [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Terminating instance
Dec 03 00:13:00 compute-1 nova_compute[187157]: 2025-12-03 00:13:00.525 187161 DEBUG nova.compute.manager [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:13:00 compute-1 kernel: tap4b0a44ca-27 (unregistering): left promiscuous mode
Dec 03 00:13:00 compute-1 NetworkManager[55553]: <info>  [1764720780.5445] device (tap4b0a44ca-27): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:13:00 compute-1 nova_compute[187157]: 2025-12-03 00:13:00.554 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:00 compute-1 ovn_controller[95464]: 2025-12-03T00:13:00Z|00201|binding|INFO|Releasing lport 4b0a44ca-2766-4f93-94c8-d849e3b3a6ed from this chassis (sb_readonly=0)
Dec 03 00:13:00 compute-1 ovn_controller[95464]: 2025-12-03T00:13:00Z|00202|binding|INFO|Setting lport 4b0a44ca-2766-4f93-94c8-d849e3b3a6ed down in Southbound
Dec 03 00:13:00 compute-1 ovn_controller[95464]: 2025-12-03T00:13:00Z|00203|binding|INFO|Removing iface tap4b0a44ca-27 ovn-installed in OVS
Dec 03 00:13:00 compute-1 nova_compute[187157]: 2025-12-03 00:13:00.557 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:00.562 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:45:d1 10.100.0.4'], port_security=['fa:16:3e:af:45:d1 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c3f975c5-7a53-467c-b760-375c84eb4469', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44651134-dca8-45c2-963a-1f17aac67593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8618acb8fd774a27ac00f4e0f10b934c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '1aec6d5f-c8c6-4b74-ad3d-5af55712b2e3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8f4f3fe-a4b0-48ff-9b01-d63a8cee7576, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=4b0a44ca-2766-4f93-94c8-d849e3b3a6ed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:13:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:00.563 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 4b0a44ca-2766-4f93-94c8-d849e3b3a6ed in datapath 44651134-dca8-45c2-963a-1f17aac67593 unbound from our chassis
Dec 03 00:13:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:00.565 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44651134-dca8-45c2-963a-1f17aac67593
Dec 03 00:13:00 compute-1 nova_compute[187157]: 2025-12-03 00:13:00.576 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:00.584 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[882a2f80-105a-4ee6-aa21-1f61b1e259b0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:00 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000015.scope: Deactivated successfully.
Dec 03 00:13:00 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000015.scope: Consumed 15.469s CPU time.
Dec 03 00:13:00 compute-1 systemd-machined[153454]: Machine qemu-17-instance-00000015 terminated.
Dec 03 00:13:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:00.613 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[380b3318-286d-4f57-a322-8f7c19c2971f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:00.616 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[94008634-5f8b-4d82-8144-2e0dc03963e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:00.640 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[c44f762a-cfd7-4fa7-8451-875a51ccb045]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:00.655 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[44b28421-9f7c-4213-9ae1-3c52e5384c76]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44651134-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:a0:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473772, 'reachable_time': 31600, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217013, 'error': None, 'target': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:00.676 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[f0728fb7-6bdb-436a-a43f-85749d09744f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap44651134-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473782, 'tstamp': 473782}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217014, 'error': None, 'target': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap44651134-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473785, 'tstamp': 473785}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217014, 'error': None, 'target': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:00.677 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44651134-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:13:00 compute-1 nova_compute[187157]: 2025-12-03 00:13:00.678 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:00 compute-1 nova_compute[187157]: 2025-12-03 00:13:00.682 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:00.682 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44651134-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:13:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:00.682 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:13:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:00.682 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44651134-d0, col_values=(('external_ids', {'iface-id': 'c9c5ad1f-d82d-4b26-aa35-4c2bd8e4a10c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:13:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:00.683 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:13:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:00.684 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e477879d-b04f-4a75-9211-066c58f96d84]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-44651134-dca8-45c2-963a-1f17aac67593\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 44651134-dca8-45c2-963a-1f17aac67593\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:00 compute-1 nova_compute[187157]: 2025-12-03 00:13:00.710 187161 DEBUG nova.compute.manager [req-44b4b730-6aa2-4af1-b7ac-49f9dec3444c req-2cbc190a-7885-456c-944d-f2eb462f1273 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Received event network-vif-unplugged-4b0a44ca-2766-4f93-94c8-d849e3b3a6ed external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:13:00 compute-1 nova_compute[187157]: 2025-12-03 00:13:00.710 187161 DEBUG oslo_concurrency.lockutils [req-44b4b730-6aa2-4af1-b7ac-49f9dec3444c req-2cbc190a-7885-456c-944d-f2eb462f1273 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:00 compute-1 nova_compute[187157]: 2025-12-03 00:13:00.710 187161 DEBUG oslo_concurrency.lockutils [req-44b4b730-6aa2-4af1-b7ac-49f9dec3444c req-2cbc190a-7885-456c-944d-f2eb462f1273 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:00 compute-1 nova_compute[187157]: 2025-12-03 00:13:00.710 187161 DEBUG oslo_concurrency.lockutils [req-44b4b730-6aa2-4af1-b7ac-49f9dec3444c req-2cbc190a-7885-456c-944d-f2eb462f1273 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:00 compute-1 nova_compute[187157]: 2025-12-03 00:13:00.711 187161 DEBUG nova.compute.manager [req-44b4b730-6aa2-4af1-b7ac-49f9dec3444c req-2cbc190a-7885-456c-944d-f2eb462f1273 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] No waiting events found dispatching network-vif-unplugged-4b0a44ca-2766-4f93-94c8-d849e3b3a6ed pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:13:00 compute-1 nova_compute[187157]: 2025-12-03 00:13:00.711 187161 DEBUG nova.compute.manager [req-44b4b730-6aa2-4af1-b7ac-49f9dec3444c req-2cbc190a-7885-456c-944d-f2eb462f1273 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Received event network-vif-unplugged-4b0a44ca-2766-4f93-94c8-d849e3b3a6ed for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:13:00 compute-1 nova_compute[187157]: 2025-12-03 00:13:00.788 187161 INFO nova.virt.libvirt.driver [-] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Instance destroyed successfully.
Dec 03 00:13:00 compute-1 nova_compute[187157]: 2025-12-03 00:13:00.788 187161 DEBUG nova.objects.instance [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lazy-loading 'resources' on Instance uuid c3f975c5-7a53-467c-b760-375c84eb4469 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.297 187161 DEBUG nova.virt.libvirt.vif [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-03T00:11:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-945677260',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-945677260',id=21,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:11:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8618acb8fd774a27ac00f4e0f10b934c',ramdisk_id='',reservation_id='r-0o5yx18k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio'
,image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:11:59Z,user_data=None,user_id='6048ff4ab0aa45689a23ca16a6558b9d',uuid=c3f975c5-7a53-467c-b760-375c84eb4469,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "address": "fa:16:3e:af:45:d1", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0a44ca-27", "ovs_interfaceid": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.298 187161 DEBUG nova.network.os_vif_util [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Converting VIF {"id": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "address": "fa:16:3e:af:45:d1", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b0a44ca-27", "ovs_interfaceid": "4b0a44ca-2766-4f93-94c8-d849e3b3a6ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.299 187161 DEBUG nova.network.os_vif_util [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:45:d1,bridge_name='br-int',has_traffic_filtering=True,id=4b0a44ca-2766-4f93-94c8-d849e3b3a6ed,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b0a44ca-27') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.299 187161 DEBUG os_vif [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:45:d1,bridge_name='br-int',has_traffic_filtering=True,id=4b0a44ca-2766-4f93-94c8-d849e3b3a6ed,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b0a44ca-27') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.300 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.300 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b0a44ca-27, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.302 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.304 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.305 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.305 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=92de6998-58eb-4906-ba7a-0d3f715f6391) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.306 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.307 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.308 187161 INFO os_vif [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:45:d1,bridge_name='br-int',has_traffic_filtering=True,id=4b0a44ca-2766-4f93-94c8-d849e3b3a6ed,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b0a44ca-27')
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.309 187161 INFO nova.virt.libvirt.driver [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Deleting instance files /var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469_del
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.309 187161 INFO nova.virt.libvirt.driver [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Deletion of /var/lib/nova/instances/c3f975c5-7a53-467c-b760-375c84eb4469_del complete
Dec 03 00:13:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:01.735 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:01.735 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:01.736 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.821 187161 INFO nova.compute.manager [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Took 1.30 seconds to destroy the instance on the hypervisor.
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.821 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.822 187161 DEBUG nova.compute.manager [-] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.822 187161 DEBUG nova.network.neutron [-] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:13:01 compute-1 nova_compute[187157]: 2025-12-03 00:13:01.822 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:13:02 compute-1 nova_compute[187157]: 2025-12-03 00:13:02.065 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:13:02 compute-1 nova_compute[187157]: 2025-12-03 00:13:02.440 187161 DEBUG nova.compute.manager [req-9dbadb3f-1d66-48f8-8c59-c4d8e6bc9a3d req-18bcf3a9-e784-470a-a834-2518d990fbe9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Received event network-vif-deleted-4b0a44ca-2766-4f93-94c8-d849e3b3a6ed external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:13:02 compute-1 nova_compute[187157]: 2025-12-03 00:13:02.440 187161 INFO nova.compute.manager [req-9dbadb3f-1d66-48f8-8c59-c4d8e6bc9a3d req-18bcf3a9-e784-470a-a834-2518d990fbe9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Neutron deleted interface 4b0a44ca-2766-4f93-94c8-d849e3b3a6ed; detaching it from the instance and deleting it from the info cache
Dec 03 00:13:02 compute-1 nova_compute[187157]: 2025-12-03 00:13:02.440 187161 DEBUG nova.network.neutron [req-9dbadb3f-1d66-48f8-8c59-c4d8e6bc9a3d req-18bcf3a9-e784-470a-a834-2518d990fbe9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:13:02 compute-1 nova_compute[187157]: 2025-12-03 00:13:02.769 187161 DEBUG nova.compute.manager [req-2934a8ed-f357-4369-98b9-4b9e5ae120c6 req-087b6e91-e484-4c0d-b23d-6c871b79cd61 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Received event network-vif-unplugged-4b0a44ca-2766-4f93-94c8-d849e3b3a6ed external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:13:02 compute-1 nova_compute[187157]: 2025-12-03 00:13:02.769 187161 DEBUG oslo_concurrency.lockutils [req-2934a8ed-f357-4369-98b9-4b9e5ae120c6 req-087b6e91-e484-4c0d-b23d-6c871b79cd61 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:02 compute-1 nova_compute[187157]: 2025-12-03 00:13:02.769 187161 DEBUG oslo_concurrency.lockutils [req-2934a8ed-f357-4369-98b9-4b9e5ae120c6 req-087b6e91-e484-4c0d-b23d-6c871b79cd61 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:02 compute-1 nova_compute[187157]: 2025-12-03 00:13:02.770 187161 DEBUG oslo_concurrency.lockutils [req-2934a8ed-f357-4369-98b9-4b9e5ae120c6 req-087b6e91-e484-4c0d-b23d-6c871b79cd61 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c3f975c5-7a53-467c-b760-375c84eb4469-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:02 compute-1 nova_compute[187157]: 2025-12-03 00:13:02.770 187161 DEBUG nova.compute.manager [req-2934a8ed-f357-4369-98b9-4b9e5ae120c6 req-087b6e91-e484-4c0d-b23d-6c871b79cd61 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] No waiting events found dispatching network-vif-unplugged-4b0a44ca-2766-4f93-94c8-d849e3b3a6ed pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:13:02 compute-1 nova_compute[187157]: 2025-12-03 00:13:02.770 187161 DEBUG nova.compute.manager [req-2934a8ed-f357-4369-98b9-4b9e5ae120c6 req-087b6e91-e484-4c0d-b23d-6c871b79cd61 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Received event network-vif-unplugged-4b0a44ca-2766-4f93-94c8-d849e3b3a6ed for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:13:02 compute-1 nova_compute[187157]: 2025-12-03 00:13:02.895 187161 DEBUG nova.network.neutron [-] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:13:02 compute-1 nova_compute[187157]: 2025-12-03 00:13:02.968 187161 DEBUG nova.compute.manager [req-9dbadb3f-1d66-48f8-8c59-c4d8e6bc9a3d req-18bcf3a9-e784-470a-a834-2518d990fbe9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Detach interface failed, port_id=4b0a44ca-2766-4f93-94c8-d849e3b3a6ed, reason: Instance c3f975c5-7a53-467c-b760-375c84eb4469 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:13:03 compute-1 nova_compute[187157]: 2025-12-03 00:13:03.160 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:03 compute-1 nova_compute[187157]: 2025-12-03 00:13:03.414 187161 INFO nova.compute.manager [-] [instance: c3f975c5-7a53-467c-b760-375c84eb4469] Took 1.59 seconds to deallocate network for instance.
Dec 03 00:13:03 compute-1 nova_compute[187157]: 2025-12-03 00:13:03.932 187161 DEBUG oslo_concurrency.lockutils [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:03 compute-1 nova_compute[187157]: 2025-12-03 00:13:03.932 187161 DEBUG oslo_concurrency.lockutils [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:04 compute-1 nova_compute[187157]: 2025-12-03 00:13:04.038 187161 DEBUG nova.compute.provider_tree [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:13:04 compute-1 nova_compute[187157]: 2025-12-03 00:13:04.545 187161 DEBUG nova.scheduler.client.report [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:13:05 compute-1 nova_compute[187157]: 2025-12-03 00:13:05.054 187161 DEBUG oslo_concurrency.lockutils [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.122s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:05 compute-1 nova_compute[187157]: 2025-12-03 00:13:05.088 187161 INFO nova.scheduler.client.report [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Deleted allocations for instance c3f975c5-7a53-467c-b760-375c84eb4469
Dec 03 00:13:05 compute-1 podman[197537]: time="2025-12-03T00:13:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:13:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:13:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:13:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:13:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3070 "" "Go-http-client/1.1"
Dec 03 00:13:06 compute-1 nova_compute[187157]: 2025-12-03 00:13:06.137 187161 DEBUG oslo_concurrency.lockutils [None req-13d60013-d21b-4438-a098-6cead05ed8b5 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "c3f975c5-7a53-467c-b760-375c84eb4469" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.146s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:06 compute-1 nova_compute[187157]: 2025-12-03 00:13:06.310 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:06 compute-1 nova_compute[187157]: 2025-12-03 00:13:06.816 187161 DEBUG oslo_concurrency.lockutils [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:06 compute-1 nova_compute[187157]: 2025-12-03 00:13:06.817 187161 DEBUG oslo_concurrency.lockutils [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:06 compute-1 nova_compute[187157]: 2025-12-03 00:13:06.817 187161 DEBUG oslo_concurrency.lockutils [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:06 compute-1 nova_compute[187157]: 2025-12-03 00:13:06.817 187161 DEBUG oslo_concurrency.lockutils [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:06 compute-1 nova_compute[187157]: 2025-12-03 00:13:06.817 187161 DEBUG oslo_concurrency.lockutils [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:06 compute-1 nova_compute[187157]: 2025-12-03 00:13:06.827 187161 INFO nova.compute.manager [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Terminating instance
Dec 03 00:13:07 compute-1 nova_compute[187157]: 2025-12-03 00:13:07.340 187161 DEBUG nova.compute.manager [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:13:07 compute-1 kernel: tapf36e34c0-cc (unregistering): left promiscuous mode
Dec 03 00:13:07 compute-1 NetworkManager[55553]: <info>  [1764720787.3660] device (tapf36e34c0-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:13:07 compute-1 ovn_controller[95464]: 2025-12-03T00:13:07Z|00204|binding|INFO|Releasing lport f36e34c0-cc70-4a73-b904-d40c504fefa3 from this chassis (sb_readonly=0)
Dec 03 00:13:07 compute-1 ovn_controller[95464]: 2025-12-03T00:13:07Z|00205|binding|INFO|Setting lport f36e34c0-cc70-4a73-b904-d40c504fefa3 down in Southbound
Dec 03 00:13:07 compute-1 nova_compute[187157]: 2025-12-03 00:13:07.372 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:07 compute-1 ovn_controller[95464]: 2025-12-03T00:13:07Z|00206|binding|INFO|Removing iface tapf36e34c0-cc ovn-installed in OVS
Dec 03 00:13:07 compute-1 nova_compute[187157]: 2025-12-03 00:13:07.373 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:07.385 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:f1:9d 10.100.0.13'], port_security=['fa:16:3e:ee:f1:9d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'eb1b85fe-471a-46bd-9929-c377144cb8eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44651134-dca8-45c2-963a-1f17aac67593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8618acb8fd774a27ac00f4e0f10b934c', 'neutron:revision_number': '15', 'neutron:security_group_ids': '1aec6d5f-c8c6-4b74-ad3d-5af55712b2e3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8f4f3fe-a4b0-48ff-9b01-d63a8cee7576, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=f36e34c0-cc70-4a73-b904-d40c504fefa3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:13:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:07.386 104348 INFO neutron.agent.ovn.metadata.agent [-] Port f36e34c0-cc70-4a73-b904-d40c504fefa3 in datapath 44651134-dca8-45c2-963a-1f17aac67593 unbound from our chassis
Dec 03 00:13:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:07.387 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44651134-dca8-45c2-963a-1f17aac67593, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:13:07 compute-1 nova_compute[187157]: 2025-12-03 00:13:07.387 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:07.388 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[fbe2cf46-1678-413c-b822-59f1cba82e19]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:07.388 104348 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-44651134-dca8-45c2-963a-1f17aac67593 namespace which is not needed anymore
Dec 03 00:13:07 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000014.scope: Deactivated successfully.
Dec 03 00:13:07 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000014.scope: Consumed 2.539s CPU time.
Dec 03 00:13:07 compute-1 systemd-machined[153454]: Machine qemu-18-instance-00000014 terminated.
Dec 03 00:13:07 compute-1 neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593[216626]: [NOTICE]   (216630) : haproxy version is 3.0.5-8e879a5
Dec 03 00:13:07 compute-1 podman[217058]: 2025-12-03 00:13:07.494764091 +0000 UTC m=+0.029168048 container kill 8fc7686eddf9f48cf047d92ad0841a5d9770ec9aa41d0d101607f19444b80f57 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 03 00:13:07 compute-1 neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593[216626]: [NOTICE]   (216630) : path to executable is /usr/sbin/haproxy
Dec 03 00:13:07 compute-1 neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593[216626]: [WARNING]  (216630) : Exiting Master process...
Dec 03 00:13:07 compute-1 neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593[216626]: [ALERT]    (216630) : Current worker (216632) exited with code 143 (Terminated)
Dec 03 00:13:07 compute-1 neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593[216626]: [WARNING]  (216630) : All workers exited. Exiting... (0)
Dec 03 00:13:07 compute-1 systemd[1]: libpod-8fc7686eddf9f48cf047d92ad0841a5d9770ec9aa41d0d101607f19444b80f57.scope: Deactivated successfully.
Dec 03 00:13:07 compute-1 nova_compute[187157]: 2025-12-03 00:13:07.514 187161 DEBUG nova.compute.manager [req-0d5bc7ab-cad0-405e-804d-b82d6e2265cb req-daa80866-0912-4b20-83fe-69fbbbd568c1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:13:07 compute-1 nova_compute[187157]: 2025-12-03 00:13:07.514 187161 DEBUG oslo_concurrency.lockutils [req-0d5bc7ab-cad0-405e-804d-b82d6e2265cb req-daa80866-0912-4b20-83fe-69fbbbd568c1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:07 compute-1 nova_compute[187157]: 2025-12-03 00:13:07.515 187161 DEBUG oslo_concurrency.lockutils [req-0d5bc7ab-cad0-405e-804d-b82d6e2265cb req-daa80866-0912-4b20-83fe-69fbbbd568c1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:07 compute-1 nova_compute[187157]: 2025-12-03 00:13:07.515 187161 DEBUG oslo_concurrency.lockutils [req-0d5bc7ab-cad0-405e-804d-b82d6e2265cb req-daa80866-0912-4b20-83fe-69fbbbd568c1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:07 compute-1 nova_compute[187157]: 2025-12-03 00:13:07.515 187161 DEBUG nova.compute.manager [req-0d5bc7ab-cad0-405e-804d-b82d6e2265cb req-daa80866-0912-4b20-83fe-69fbbbd568c1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] No waiting events found dispatching network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:13:07 compute-1 nova_compute[187157]: 2025-12-03 00:13:07.515 187161 DEBUG nova.compute.manager [req-0d5bc7ab-cad0-405e-804d-b82d6e2265cb req-daa80866-0912-4b20-83fe-69fbbbd568c1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:13:07 compute-1 podman[217073]: 2025-12-03 00:13:07.538903694 +0000 UTC m=+0.024646665 container died 8fc7686eddf9f48cf047d92ad0841a5d9770ec9aa41d0d101607f19444b80f57 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Dec 03 00:13:07 compute-1 nova_compute[187157]: 2025-12-03 00:13:07.595 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:07 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8fc7686eddf9f48cf047d92ad0841a5d9770ec9aa41d0d101607f19444b80f57-userdata-shm.mount: Deactivated successfully.
Dec 03 00:13:07 compute-1 systemd[1]: var-lib-containers-storage-overlay-5c541ca7a4c0f119edbbc24ad63b6ac0789bc085b8386caf09bb68d2572afee7-merged.mount: Deactivated successfully.
Dec 03 00:13:07 compute-1 podman[217073]: 2025-12-03 00:13:07.60835266 +0000 UTC m=+0.094095621 container cleanup 8fc7686eddf9f48cf047d92ad0841a5d9770ec9aa41d0d101607f19444b80f57 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 03 00:13:07 compute-1 systemd[1]: libpod-conmon-8fc7686eddf9f48cf047d92ad0841a5d9770ec9aa41d0d101607f19444b80f57.scope: Deactivated successfully.
Dec 03 00:13:07 compute-1 nova_compute[187157]: 2025-12-03 00:13:07.624 187161 INFO nova.virt.libvirt.driver [-] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Instance destroyed successfully.
Dec 03 00:13:07 compute-1 nova_compute[187157]: 2025-12-03 00:13:07.624 187161 DEBUG nova.objects.instance [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lazy-loading 'resources' on Instance uuid eb1b85fe-471a-46bd-9929-c377144cb8eb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:13:07 compute-1 podman[217075]: 2025-12-03 00:13:07.626961327 +0000 UTC m=+0.104814885 container remove 8fc7686eddf9f48cf047d92ad0841a5d9770ec9aa41d0d101607f19444b80f57 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202)
Dec 03 00:13:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:07.632 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[f7013ff1-5e80-493f-95bc-fa3338467d7d]: (4, ("Wed Dec  3 12:13:07 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593 (8fc7686eddf9f48cf047d92ad0841a5d9770ec9aa41d0d101607f19444b80f57)\n8fc7686eddf9f48cf047d92ad0841a5d9770ec9aa41d0d101607f19444b80f57\nWed Dec  3 12:13:07 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593 (8fc7686eddf9f48cf047d92ad0841a5d9770ec9aa41d0d101607f19444b80f57)\n8fc7686eddf9f48cf047d92ad0841a5d9770ec9aa41d0d101607f19444b80f57\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:07.633 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[663d356d-a6f9-4ef8-894f-516d693663d2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:07.633 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:13:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:07.633 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[15acd3e3-1a51-4d5c-9333-5cb113df401b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:07.634 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44651134-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:13:07 compute-1 nova_compute[187157]: 2025-12-03 00:13:07.635 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:07 compute-1 kernel: tap44651134-d0: left promiscuous mode
Dec 03 00:13:07 compute-1 nova_compute[187157]: 2025-12-03 00:13:07.649 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:07.651 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[8632c452-8bd1-409a-b9d8-4754b88bddf2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:07.665 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[af6b7650-9a6f-4af6-99e1-cb3b25b22846]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:07.666 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b18ea005-ffeb-4a42-99a1-cd3f7a6358e7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:07.680 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[7796ad84-263c-472f-a6b9-0a36a58c7330]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473766, 'reachable_time': 25562, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217120, 'error': None, 'target': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:07.681 104464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-44651134-dca8-45c2-963a-1f17aac67593 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:13:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:07.681 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[65da89d0-7608-47f3-89ae-aeb5cf67a698]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:07 compute-1 systemd[1]: run-netns-ovnmeta\x2d44651134\x2ddca8\x2d45c2\x2d963a\x2d1f17aac67593.mount: Deactivated successfully.
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.133 187161 DEBUG nova.virt.libvirt.vif [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-03T00:11:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1319059932',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1319059932',id=20,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:11:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8618acb8fd774a27ac00f4e0f10b934c',ramdisk_id='',reservation_id='r-0nyztijh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:12:55Z,user_data=None,user_id='6048ff4ab0aa45689a23ca16a6558b9d',uuid=eb1b85fe-471a-46bd-9929-c377144cb8eb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.134 187161 DEBUG nova.network.os_vif_util [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Converting VIF {"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.134 187161 DEBUG nova.network.os_vif_util [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f1:9d,bridge_name='br-int',has_traffic_filtering=True,id=f36e34c0-cc70-4a73-b904-d40c504fefa3,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf36e34c0-cc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.135 187161 DEBUG os_vif [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f1:9d,bridge_name='br-int',has_traffic_filtering=True,id=f36e34c0-cc70-4a73-b904-d40c504fefa3,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf36e34c0-cc') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.136 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.137 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf36e34c0-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.138 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.139 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.140 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.141 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=0a16c77c-4bbd-4e82-a2cb-fe824cd7efa3) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.141 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.142 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.143 187161 INFO os_vif [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f1:9d,bridge_name='br-int',has_traffic_filtering=True,id=f36e34c0-cc70-4a73-b904-d40c504fefa3,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf36e34c0-cc')
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.144 187161 INFO nova.virt.libvirt.driver [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Deleting instance files /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb_del
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.144 187161 INFO nova.virt.libvirt.driver [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Deletion of /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb_del complete
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.161 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:08 compute-1 sshd-session[217121]: Received disconnect from 193.46.255.217 port 59436:11:  [preauth]
Dec 03 00:13:08 compute-1 sshd-session[217121]: Disconnected from authenticating user root 193.46.255.217 port 59436 [preauth]
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.655 187161 INFO nova.compute.manager [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Took 1.31 seconds to destroy the instance on the hypervisor.
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.655 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.655 187161 DEBUG nova.compute.manager [-] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.655 187161 DEBUG nova.network.neutron [-] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.656 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:13:08 compute-1 nova_compute[187157]: 2025-12-03 00:13:08.796 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:13:09 compute-1 nova_compute[187157]: 2025-12-03 00:13:09.385 187161 DEBUG nova.compute.manager [req-4d71e156-d314-40d9-81e8-af5685f6d79a req-b507ca17-5bf1-4d75-b7e3-771cdf9e639d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-deleted-f36e34c0-cc70-4a73-b904-d40c504fefa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:13:09 compute-1 nova_compute[187157]: 2025-12-03 00:13:09.385 187161 INFO nova.compute.manager [req-4d71e156-d314-40d9-81e8-af5685f6d79a req-b507ca17-5bf1-4d75-b7e3-771cdf9e639d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Neutron deleted interface f36e34c0-cc70-4a73-b904-d40c504fefa3; detaching it from the instance and deleting it from the info cache
Dec 03 00:13:09 compute-1 nova_compute[187157]: 2025-12-03 00:13:09.385 187161 DEBUG nova.network.neutron [req-4d71e156-d314-40d9-81e8-af5685f6d79a req-b507ca17-5bf1-4d75-b7e3-771cdf9e639d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:13:09 compute-1 nova_compute[187157]: 2025-12-03 00:13:09.579 187161 DEBUG nova.compute.manager [req-daa537fd-9fc2-4522-849c-fbea1af4e827 req-bc5c5ff5-5efc-4097-9953-24fc4af3dee7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:13:09 compute-1 nova_compute[187157]: 2025-12-03 00:13:09.579 187161 DEBUG oslo_concurrency.lockutils [req-daa537fd-9fc2-4522-849c-fbea1af4e827 req-bc5c5ff5-5efc-4097-9953-24fc4af3dee7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:09 compute-1 nova_compute[187157]: 2025-12-03 00:13:09.580 187161 DEBUG oslo_concurrency.lockutils [req-daa537fd-9fc2-4522-849c-fbea1af4e827 req-bc5c5ff5-5efc-4097-9953-24fc4af3dee7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:09 compute-1 nova_compute[187157]: 2025-12-03 00:13:09.580 187161 DEBUG oslo_concurrency.lockutils [req-daa537fd-9fc2-4522-849c-fbea1af4e827 req-bc5c5ff5-5efc-4097-9953-24fc4af3dee7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:09 compute-1 nova_compute[187157]: 2025-12-03 00:13:09.580 187161 DEBUG nova.compute.manager [req-daa537fd-9fc2-4522-849c-fbea1af4e827 req-bc5c5ff5-5efc-4097-9953-24fc4af3dee7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] No waiting events found dispatching network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:13:09 compute-1 nova_compute[187157]: 2025-12-03 00:13:09.580 187161 DEBUG nova.compute.manager [req-daa537fd-9fc2-4522-849c-fbea1af4e827 req-bc5c5ff5-5efc-4097-9953-24fc4af3dee7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:13:09 compute-1 nova_compute[187157]: 2025-12-03 00:13:09.841 187161 DEBUG nova.network.neutron [-] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:13:09 compute-1 nova_compute[187157]: 2025-12-03 00:13:09.891 187161 DEBUG nova.compute.manager [req-4d71e156-d314-40d9-81e8-af5685f6d79a req-b507ca17-5bf1-4d75-b7e3-771cdf9e639d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Detach interface failed, port_id=f36e34c0-cc70-4a73-b904-d40c504fefa3, reason: Instance eb1b85fe-471a-46bd-9929-c377144cb8eb could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:13:10 compute-1 nova_compute[187157]: 2025-12-03 00:13:10.347 187161 INFO nova.compute.manager [-] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Took 1.69 seconds to deallocate network for instance.
Dec 03 00:13:10 compute-1 nova_compute[187157]: 2025-12-03 00:13:10.868 187161 DEBUG oslo_concurrency.lockutils [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:10 compute-1 nova_compute[187157]: 2025-12-03 00:13:10.868 187161 DEBUG oslo_concurrency.lockutils [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:10 compute-1 nova_compute[187157]: 2025-12-03 00:13:10.877 187161 DEBUG oslo_concurrency.lockutils [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.008s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:10 compute-1 nova_compute[187157]: 2025-12-03 00:13:10.917 187161 INFO nova.scheduler.client.report [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Deleted allocations for instance eb1b85fe-471a-46bd-9929-c377144cb8eb
Dec 03 00:13:11 compute-1 podman[217123]: 2025-12-03 00:13:11.237187101 +0000 UTC m=+0.071549099 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.component=ubi9-minimal-container)
Dec 03 00:13:11 compute-1 nova_compute[187157]: 2025-12-03 00:13:11.948 187161 DEBUG oslo_concurrency.lockutils [None req-398c18fb-974c-4991-b001-e52165412173 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.132s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:13 compute-1 nova_compute[187157]: 2025-12-03 00:13:13.143 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:13 compute-1 nova_compute[187157]: 2025-12-03 00:13:13.164 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:15 compute-1 podman[217144]: 2025-12-03 00:13:15.21238947 +0000 UTC m=+0.053003413 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 03 00:13:17 compute-1 nova_compute[187157]: 2025-12-03 00:13:17.477 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:18 compute-1 nova_compute[187157]: 2025-12-03 00:13:18.165 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:13:18 compute-1 nova_compute[187157]: 2025-12-03 00:13:18.167 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:13:18 compute-1 nova_compute[187157]: 2025-12-03 00:13:18.167 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Dec 03 00:13:18 compute-1 nova_compute[187157]: 2025-12-03 00:13:18.167 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:13:18 compute-1 nova_compute[187157]: 2025-12-03 00:13:18.191 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:18 compute-1 nova_compute[187157]: 2025-12-03 00:13:18.192 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:13:18 compute-1 nova_compute[187157]: 2025-12-03 00:13:18.500 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:19 compute-1 openstack_network_exporter[199685]: ERROR   00:13:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:13:19 compute-1 openstack_network_exporter[199685]: ERROR   00:13:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:13:19 compute-1 openstack_network_exporter[199685]: ERROR   00:13:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:13:19 compute-1 openstack_network_exporter[199685]: ERROR   00:13:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:13:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:13:19 compute-1 openstack_network_exporter[199685]: ERROR   00:13:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:13:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:13:23 compute-1 nova_compute[187157]: 2025-12-03 00:13:23.191 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:23 compute-1 nova_compute[187157]: 2025-12-03 00:13:23.193 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:24 compute-1 podman[217165]: 2025-12-03 00:13:24.227449017 +0000 UTC m=+0.072424509 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:13:28 compute-1 nova_compute[187157]: 2025-12-03 00:13:28.194 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:13:28 compute-1 nova_compute[187157]: 2025-12-03 00:13:28.196 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:13:28 compute-1 nova_compute[187157]: 2025-12-03 00:13:28.196 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Dec 03 00:13:28 compute-1 nova_compute[187157]: 2025-12-03 00:13:28.196 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:13:28 compute-1 nova_compute[187157]: 2025-12-03 00:13:28.225 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:28 compute-1 nova_compute[187157]: 2025-12-03 00:13:28.225 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:13:29 compute-1 podman[217190]: 2025-12-03 00:13:29.207560453 +0000 UTC m=+0.053884314 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:13:29 compute-1 podman[217191]: 2025-12-03 00:13:29.257294794 +0000 UTC m=+0.100714594 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Dec 03 00:13:32 compute-1 sshd-session[217233]: Connection closed by 45.148.10.240 port 60460
Dec 03 00:13:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:32.858 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:2b:fe 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1299554f9c3e4ee7a7991ca25c47f7c1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f585d4-379a-4fd2-84e7-6b4069bbb279, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=42f0d9e7-7c77-4247-8972-6beac3a53206) old=Port_Binding(mac=['fa:16:3e:99:2b:fe'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1299554f9c3e4ee7a7991ca25c47f7c1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:13:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:32.859 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 42f0d9e7-7c77-4247-8972-6beac3a53206 in datapath ee60e03c-ab3a-419f-84ef-62aec4b6b0dd updated
Dec 03 00:13:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:32.859 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:13:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:32.860 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[11f19d12-bb5a-4121-a80f-027a27dd4fb2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:33 compute-1 nova_compute[187157]: 2025-12-03 00:13:33.226 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:13:33 compute-1 nova_compute[187157]: 2025-12-03 00:13:33.227 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:13:33 compute-1 nova_compute[187157]: 2025-12-03 00:13:33.227 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Dec 03 00:13:33 compute-1 nova_compute[187157]: 2025-12-03 00:13:33.227 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:13:33 compute-1 nova_compute[187157]: 2025-12-03 00:13:33.240 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:33 compute-1 nova_compute[187157]: 2025-12-03 00:13:33.240 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:13:35 compute-1 podman[197537]: time="2025-12-03T00:13:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:13:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:13:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:13:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:13:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2612 "" "Go-http-client/1.1"
Dec 03 00:13:38 compute-1 nova_compute[187157]: 2025-12-03 00:13:38.241 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:13:38 compute-1 nova_compute[187157]: 2025-12-03 00:13:38.243 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:38 compute-1 nova_compute[187157]: 2025-12-03 00:13:38.243 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Dec 03 00:13:38 compute-1 nova_compute[187157]: 2025-12-03 00:13:38.243 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:13:38 compute-1 nova_compute[187157]: 2025-12-03 00:13:38.243 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:13:38 compute-1 nova_compute[187157]: 2025-12-03 00:13:38.244 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:39 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:39.873 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:13:39 compute-1 nova_compute[187157]: 2025-12-03 00:13:39.874 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:39 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:39.875 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:13:40 compute-1 nova_compute[187157]: 2025-12-03 00:13:40.211 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:40.972 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:35:a1 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4ec4e0a2-2d69-48a4-b43f-5378b9156efd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ec4e0a2-2d69-48a4-b43f-5378b9156efd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ca79c0e-a98f-49bb-a5b9-e71f73a04bad, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=8a7bc21f-8de6-41a4-bcee-f8a6bbb9133f) old=Port_Binding(mac=['fa:16:3e:92:35:a1'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-4ec4e0a2-2d69-48a4-b43f-5378b9156efd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ec4e0a2-2d69-48a4-b43f-5378b9156efd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:13:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:40.974 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 8a7bc21f-8de6-41a4-bcee-f8a6bbb9133f in datapath 4ec4e0a2-2d69-48a4-b43f-5378b9156efd updated
Dec 03 00:13:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:40.975 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4ec4e0a2-2d69-48a4-b43f-5378b9156efd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:13:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:40.976 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[1533b06d-333a-4763-ba9a-fea47673a646]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:42 compute-1 podman[217235]: 2025-12-03 00:13:42.20348228 +0000 UTC m=+0.050615704 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Dec 03 00:13:43 compute-1 nova_compute[187157]: 2025-12-03 00:13:43.243 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:45 compute-1 nova_compute[187157]: 2025-12-03 00:13:45.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:46 compute-1 podman[217256]: 2025-12-03 00:13:46.217409826 +0000 UTC m=+0.061108472 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 03 00:13:46 compute-1 nova_compute[187157]: 2025-12-03 00:13:46.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:47 compute-1 nova_compute[187157]: 2025-12-03 00:13:47.217 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:47 compute-1 nova_compute[187157]: 2025-12-03 00:13:47.218 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:47 compute-1 nova_compute[187157]: 2025-12-03 00:13:47.218 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:47 compute-1 nova_compute[187157]: 2025-12-03 00:13:47.218 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:13:47 compute-1 nova_compute[187157]: 2025-12-03 00:13:47.355 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:13:47 compute-1 nova_compute[187157]: 2025-12-03 00:13:47.356 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:13:47 compute-1 nova_compute[187157]: 2025-12-03 00:13:47.370 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:13:47 compute-1 nova_compute[187157]: 2025-12-03 00:13:47.371 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5850MB free_disk=73.16611862182617GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:13:47 compute-1 nova_compute[187157]: 2025-12-03 00:13:47.371 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:47 compute-1 nova_compute[187157]: 2025-12-03 00:13:47.372 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:48 compute-1 nova_compute[187157]: 2025-12-03 00:13:48.244 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:13:48 compute-1 nova_compute[187157]: 2025-12-03 00:13:48.246 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:48 compute-1 nova_compute[187157]: 2025-12-03 00:13:48.246 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Dec 03 00:13:48 compute-1 nova_compute[187157]: 2025-12-03 00:13:48.246 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:13:48 compute-1 nova_compute[187157]: 2025-12-03 00:13:48.247 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:13:48 compute-1 nova_compute[187157]: 2025-12-03 00:13:48.248 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:48 compute-1 nova_compute[187157]: 2025-12-03 00:13:48.534 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:13:48 compute-1 nova_compute[187157]: 2025-12-03 00:13:48.534 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:13:47 up  1:20,  0 user,  load average: 0.42, 0.39, 0.38\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:13:48 compute-1 nova_compute[187157]: 2025-12-03 00:13:48.584 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing inventories for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 03 00:13:48 compute-1 nova_compute[187157]: 2025-12-03 00:13:48.598 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Updating ProviderTree inventory for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 03 00:13:48 compute-1 nova_compute[187157]: 2025-12-03 00:13:48.598 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Updating inventory in ProviderTree for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 03 00:13:48 compute-1 nova_compute[187157]: 2025-12-03 00:13:48.610 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing aggregate associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 03 00:13:48 compute-1 nova_compute[187157]: 2025-12-03 00:13:48.631 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing trait associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ARCH_X86_64,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 03 00:13:48 compute-1 nova_compute[187157]: 2025-12-03 00:13:48.647 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:13:49 compute-1 nova_compute[187157]: 2025-12-03 00:13:49.152 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:13:49 compute-1 openstack_network_exporter[199685]: ERROR   00:13:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:13:49 compute-1 openstack_network_exporter[199685]: ERROR   00:13:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:13:49 compute-1 openstack_network_exporter[199685]: ERROR   00:13:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:13:49 compute-1 openstack_network_exporter[199685]: ERROR   00:13:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:13:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:13:49 compute-1 openstack_network_exporter[199685]: ERROR   00:13:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:13:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:13:49 compute-1 nova_compute[187157]: 2025-12-03 00:13:49.661 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:13:49 compute-1 nova_compute[187157]: 2025-12-03 00:13:49.661 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.290s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:49 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:13:49.876 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:13:50 compute-1 nova_compute[187157]: 2025-12-03 00:13:50.661 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:51 compute-1 nova_compute[187157]: 2025-12-03 00:13:51.695 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:53 compute-1 nova_compute[187157]: 2025-12-03 00:13:53.248 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:13:53 compute-1 nova_compute[187157]: 2025-12-03 00:13:53.250 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:13:53 compute-1 nova_compute[187157]: 2025-12-03 00:13:53.250 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Dec 03 00:13:53 compute-1 nova_compute[187157]: 2025-12-03 00:13:53.250 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:13:53 compute-1 nova_compute[187157]: 2025-12-03 00:13:53.269 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:53 compute-1 nova_compute[187157]: 2025-12-03 00:13:53.270 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:13:53 compute-1 ovn_controller[95464]: 2025-12-03T00:13:53Z|00207|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Dec 03 00:13:55 compute-1 podman[217279]: 2025-12-03 00:13:55.239353234 +0000 UTC m=+0.078155249 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:13:56 compute-1 nova_compute[187157]: 2025-12-03 00:13:56.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:56 compute-1 nova_compute[187157]: 2025-12-03 00:13:56.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:56 compute-1 nova_compute[187157]: 2025-12-03 00:13:56.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:13:57 compute-1 nova_compute[187157]: 2025-12-03 00:13:57.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:58 compute-1 nova_compute[187157]: 2025-12-03 00:13:58.271 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:13:58 compute-1 nova_compute[187157]: 2025-12-03 00:13:58.272 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:14:00 compute-1 podman[217303]: 2025-12-03 00:14:00.202173917 +0000 UTC m=+0.045513629 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 00:14:00 compute-1 podman[217304]: 2025-12-03 00:14:00.234195423 +0000 UTC m=+0.073827904 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 03 00:14:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:14:01.736 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:14:01.737 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:14:01.737 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:03 compute-1 nova_compute[187157]: 2025-12-03 00:14:03.272 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:05 compute-1 podman[197537]: time="2025-12-03T00:14:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:14:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:14:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:14:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:14:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2614 "" "Go-http-client/1.1"
Dec 03 00:14:08 compute-1 nova_compute[187157]: 2025-12-03 00:14:08.273 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:14:08 compute-1 nova_compute[187157]: 2025-12-03 00:14:08.275 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:14:08 compute-1 nova_compute[187157]: 2025-12-03 00:14:08.275 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Dec 03 00:14:08 compute-1 nova_compute[187157]: 2025-12-03 00:14:08.275 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:14:08 compute-1 nova_compute[187157]: 2025-12-03 00:14:08.301 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:08 compute-1 nova_compute[187157]: 2025-12-03 00:14:08.301 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:14:13 compute-1 podman[217345]: 2025-12-03 00:14:13.204072359 +0000 UTC m=+0.051681120 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Dec 03 00:14:13 compute-1 nova_compute[187157]: 2025-12-03 00:14:13.302 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:17 compute-1 podman[217366]: 2025-12-03 00:14:17.220296533 +0000 UTC m=+0.057256027 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Dec 03 00:14:18 compute-1 nova_compute[187157]: 2025-12-03 00:14:18.303 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:18 compute-1 nova_compute[187157]: 2025-12-03 00:14:18.304 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:19 compute-1 openstack_network_exporter[199685]: ERROR   00:14:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:14:19 compute-1 openstack_network_exporter[199685]: ERROR   00:14:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:14:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:14:19 compute-1 openstack_network_exporter[199685]: ERROR   00:14:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:14:19 compute-1 openstack_network_exporter[199685]: ERROR   00:14:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:14:19 compute-1 openstack_network_exporter[199685]: ERROR   00:14:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:14:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:14:23 compute-1 nova_compute[187157]: 2025-12-03 00:14:23.304 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:14:23 compute-1 nova_compute[187157]: 2025-12-03 00:14:23.306 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:23 compute-1 nova_compute[187157]: 2025-12-03 00:14:23.306 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Dec 03 00:14:23 compute-1 nova_compute[187157]: 2025-12-03 00:14:23.306 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:14:23 compute-1 nova_compute[187157]: 2025-12-03 00:14:23.306 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:14:23 compute-1 nova_compute[187157]: 2025-12-03 00:14:23.307 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:26 compute-1 podman[217387]: 2025-12-03 00:14:26.212734407 +0000 UTC m=+0.055589067 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:14:28 compute-1 nova_compute[187157]: 2025-12-03 00:14:28.308 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:14:31 compute-1 podman[217412]: 2025-12-03 00:14:31.22083543 +0000 UTC m=+0.066985936 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true)
Dec 03 00:14:31 compute-1 podman[217413]: 2025-12-03 00:14:31.233282566 +0000 UTC m=+0.075477175 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 03 00:14:33 compute-1 nova_compute[187157]: 2025-12-03 00:14:33.309 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:14:33 compute-1 nova_compute[187157]: 2025-12-03 00:14:33.311 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:14:33 compute-1 nova_compute[187157]: 2025-12-03 00:14:33.311 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Dec 03 00:14:33 compute-1 nova_compute[187157]: 2025-12-03 00:14:33.311 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:14:33 compute-1 nova_compute[187157]: 2025-12-03 00:14:33.350 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:33 compute-1 nova_compute[187157]: 2025-12-03 00:14:33.351 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:14:35 compute-1 podman[197537]: time="2025-12-03T00:14:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:14:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:14:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:14:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:14:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2609 "" "Go-http-client/1.1"
Dec 03 00:14:38 compute-1 nova_compute[187157]: 2025-12-03 00:14:38.352 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:41 compute-1 nova_compute[187157]: 2025-12-03 00:14:41.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:14:43 compute-1 nova_compute[187157]: 2025-12-03 00:14:43.352 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:44 compute-1 podman[217456]: 2025-12-03 00:14:44.254622005 +0000 UTC m=+0.097032174 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Dec 03 00:14:44 compute-1 nova_compute[187157]: 2025-12-03 00:14:44.860 187161 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Creating tmpfile /var/lib/nova/instances/tmp763x_1ui to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 03 00:14:44 compute-1 nova_compute[187157]: 2025-12-03 00:14:44.861 187161 WARNING neutronclient.v2_0.client [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:14:44 compute-1 nova_compute[187157]: 2025-12-03 00:14:44.864 187161 DEBUG nova.compute.manager [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp763x_1ui',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 03 00:14:45 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 03 00:14:46 compute-1 nova_compute[187157]: 2025-12-03 00:14:46.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:14:46 compute-1 nova_compute[187157]: 2025-12-03 00:14:46.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:14:46 compute-1 nova_compute[187157]: 2025-12-03 00:14:46.922 187161 WARNING neutronclient.v2_0.client [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:14:47 compute-1 nova_compute[187157]: 2025-12-03 00:14:47.215 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:47 compute-1 nova_compute[187157]: 2025-12-03 00:14:47.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:47 compute-1 nova_compute[187157]: 2025-12-03 00:14:47.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:47 compute-1 nova_compute[187157]: 2025-12-03 00:14:47.217 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:14:47 compute-1 nova_compute[187157]: 2025-12-03 00:14:47.339 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:14:47 compute-1 nova_compute[187157]: 2025-12-03 00:14:47.340 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:47 compute-1 nova_compute[187157]: 2025-12-03 00:14:47.358 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:47 compute-1 nova_compute[187157]: 2025-12-03 00:14:47.359 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5848MB free_disk=73.16615295410156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:14:47 compute-1 nova_compute[187157]: 2025-12-03 00:14:47.359 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:47 compute-1 nova_compute[187157]: 2025-12-03 00:14:47.359 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:48 compute-1 podman[217479]: 2025-12-03 00:14:48.260180787 +0000 UTC m=+0.088506786 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:14:48 compute-1 nova_compute[187157]: 2025-12-03 00:14:48.354 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:14:48 compute-1 nova_compute[187157]: 2025-12-03 00:14:48.355 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:48 compute-1 nova_compute[187157]: 2025-12-03 00:14:48.356 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Dec 03 00:14:48 compute-1 nova_compute[187157]: 2025-12-03 00:14:48.356 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:14:48 compute-1 nova_compute[187157]: 2025-12-03 00:14:48.357 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:14:48 compute-1 nova_compute[187157]: 2025-12-03 00:14:48.358 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:48 compute-1 nova_compute[187157]: 2025-12-03 00:14:48.909 187161 WARNING nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 799be56b-eb56-4319-a027-b0fe2cf7991f has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Dec 03 00:14:48 compute-1 nova_compute[187157]: 2025-12-03 00:14:48.910 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:14:48 compute-1 nova_compute[187157]: 2025-12-03 00:14:48.910 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:14:47 up  1:21,  0 user,  load average: 0.15, 0.32, 0.35\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:14:48 compute-1 nova_compute[187157]: 2025-12-03 00:14:48.946 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:14:49 compute-1 openstack_network_exporter[199685]: ERROR   00:14:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:14:49 compute-1 openstack_network_exporter[199685]: ERROR   00:14:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:14:49 compute-1 openstack_network_exporter[199685]: ERROR   00:14:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:14:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:14:49 compute-1 openstack_network_exporter[199685]: ERROR   00:14:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:14:49 compute-1 openstack_network_exporter[199685]: ERROR   00:14:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:14:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:14:49 compute-1 nova_compute[187157]: 2025-12-03 00:14:49.455 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:14:49 compute-1 nova_compute[187157]: 2025-12-03 00:14:49.966 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:14:49 compute-1 nova_compute[187157]: 2025-12-03 00:14:49.967 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.608s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:50 compute-1 nova_compute[187157]: 2025-12-03 00:14:50.969 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:14:51 compute-1 nova_compute[187157]: 2025-12-03 00:14:51.696 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:14:52 compute-1 nova_compute[187157]: 2025-12-03 00:14:52.606 187161 DEBUG nova.compute.manager [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp763x_1ui',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='799be56b-eb56-4319-a027-b0fe2cf7991f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 03 00:14:52 compute-1 sshd-session[217499]: Invalid user solana from 193.32.162.146 port 57068
Dec 03 00:14:52 compute-1 sshd-session[217499]: Connection closed by invalid user solana 193.32.162.146 port 57068 [preauth]
Dec 03 00:14:53 compute-1 nova_compute[187157]: 2025-12-03 00:14:53.358 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:14:53 compute-1 nova_compute[187157]: 2025-12-03 00:14:53.360 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:14:53 compute-1 nova_compute[187157]: 2025-12-03 00:14:53.360 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Dec 03 00:14:53 compute-1 nova_compute[187157]: 2025-12-03 00:14:53.360 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:14:53 compute-1 nova_compute[187157]: 2025-12-03 00:14:53.391 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:53 compute-1 nova_compute[187157]: 2025-12-03 00:14:53.392 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 03 00:14:53 compute-1 nova_compute[187157]: 2025-12-03 00:14:53.623 187161 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-799be56b-eb56-4319-a027-b0fe2cf7991f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:14:53 compute-1 nova_compute[187157]: 2025-12-03 00:14:53.624 187161 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-799be56b-eb56-4319-a027-b0fe2cf7991f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:14:53 compute-1 nova_compute[187157]: 2025-12-03 00:14:53.624 187161 DEBUG nova.network.neutron [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:14:54 compute-1 nova_compute[187157]: 2025-12-03 00:14:54.131 187161 WARNING neutronclient.v2_0.client [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:14:54 compute-1 nova_compute[187157]: 2025-12-03 00:14:54.637 187161 WARNING neutronclient.v2_0.client [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:14:54 compute-1 nova_compute[187157]: 2025-12-03 00:14:54.834 187161 DEBUG nova.network.neutron [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Updating instance_info_cache with network_info: [{"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:14:55 compute-1 nova_compute[187157]: 2025-12-03 00:14:55.341 187161 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-799be56b-eb56-4319-a027-b0fe2cf7991f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:14:55 compute-1 nova_compute[187157]: 2025-12-03 00:14:55.360 187161 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp763x_1ui',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='799be56b-eb56-4319-a027-b0fe2cf7991f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 03 00:14:55 compute-1 nova_compute[187157]: 2025-12-03 00:14:55.361 187161 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Creating instance directory: /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 03 00:14:55 compute-1 nova_compute[187157]: 2025-12-03 00:14:55.362 187161 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Creating disk.info with the contents: {'/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk': 'qcow2', '/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 03 00:14:55 compute-1 nova_compute[187157]: 2025-12-03 00:14:55.362 187161 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 03 00:14:55 compute-1 nova_compute[187157]: 2025-12-03 00:14:55.363 187161 DEBUG nova.objects.instance [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 799be56b-eb56-4319-a027-b0fe2cf7991f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:14:55 compute-1 nova_compute[187157]: 2025-12-03 00:14:55.869 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:14:55 compute-1 nova_compute[187157]: 2025-12-03 00:14:55.876 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:14:55 compute-1 nova_compute[187157]: 2025-12-03 00:14:55.879 187161 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:55 compute-1 nova_compute[187157]: 2025-12-03 00:14:55.953 187161 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:55 compute-1 nova_compute[187157]: 2025-12-03 00:14:55.954 187161 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:55 compute-1 nova_compute[187157]: 2025-12-03 00:14:55.956 187161 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:55 compute-1 nova_compute[187157]: 2025-12-03 00:14:55.957 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:14:55 compute-1 nova_compute[187157]: 2025-12-03 00:14:55.966 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:14:55 compute-1 nova_compute[187157]: 2025-12-03 00:14:55.967 187161 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.030 187161 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.031 187161 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.066 187161 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.067 187161 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.068 187161 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.117 187161 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.118 187161 DEBUG nova.virt.disk.api [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.119 187161 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.167 187161 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.168 187161 DEBUG nova.virt.disk.api [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.168 187161 DEBUG nova.objects.instance [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 799be56b-eb56-4319-a027-b0fe2cf7991f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.677 187161 DEBUG nova.objects.base [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<799be56b-eb56-4319-a027-b0fe2cf7991f> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.678 187161 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.702 187161 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk.config 497664" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.704 187161 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.707 187161 DEBUG nova.virt.libvirt.vif [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:13:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1118620355',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1118620355',id=22,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:14:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-103ztj3x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:14:10Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=799be56b-eb56-4319-a027-b0fe2cf7991f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.708 187161 DEBUG nova.network.os_vif_util [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.709 187161 DEBUG nova.network.os_vif_util [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:29:37,bridge_name='br-int',has_traffic_filtering=True,id=de4a7aac-87a1-4237-9c69-504ca4fa7d87,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde4a7aac-87') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.711 187161 DEBUG os_vif [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:29:37,bridge_name='br-int',has_traffic_filtering=True,id=de4a7aac-87a1-4237-9c69-504ca4fa7d87,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde4a7aac-87') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.712 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.713 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.714 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.715 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.715 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'd80bb76c-ed29-5ef3-a19d-c23d06690b98', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.717 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.721 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.725 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.725 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapde4a7aac-87, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.726 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapde4a7aac-87, col_values=(('qos', UUID('fb12a03b-5849-424a-903d-8d68c4da7653')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.727 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapde4a7aac-87, col_values=(('external_ids', {'iface-id': 'de4a7aac-87a1-4237-9c69-504ca4fa7d87', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:da:29:37', 'vm-uuid': '799be56b-eb56-4319-a027-b0fe2cf7991f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.729 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:56 compute-1 NetworkManager[55553]: <info>  [1764720896.7305] manager: (tapde4a7aac-87): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.732 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.737 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.738 187161 INFO os_vif [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:29:37,bridge_name='br-int',has_traffic_filtering=True,id=de4a7aac-87a1-4237-9c69-504ca4fa7d87,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde4a7aac-87')
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.739 187161 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.740 187161 DEBUG nova.compute.manager [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp763x_1ui',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='799be56b-eb56-4319-a027-b0fe2cf7991f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 03 00:14:56 compute-1 nova_compute[187157]: 2025-12-03 00:14:56.741 187161 WARNING neutronclient.v2_0.client [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:14:57 compute-1 podman[217521]: 2025-12-03 00:14:57.219351554 +0000 UTC m=+0.059696396 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:14:57 compute-1 nova_compute[187157]: 2025-12-03 00:14:57.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:14:57 compute-1 nova_compute[187157]: 2025-12-03 00:14:57.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:14:57 compute-1 nova_compute[187157]: 2025-12-03 00:14:57.806 187161 WARNING neutronclient.v2_0.client [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:14:58 compute-1 nova_compute[187157]: 2025-12-03 00:14:58.393 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:58 compute-1 nova_compute[187157]: 2025-12-03 00:14:58.400 187161 DEBUG nova.network.neutron [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Port de4a7aac-87a1-4237-9c69-504ca4fa7d87 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 03 00:14:58 compute-1 nova_compute[187157]: 2025-12-03 00:14:58.409 187161 DEBUG nova.compute.manager [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp763x_1ui',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='799be56b-eb56-4319-a027-b0fe2cf7991f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 03 00:14:58 compute-1 nova_compute[187157]: 2025-12-03 00:14:58.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:01 compute-1 nova_compute[187157]: 2025-12-03 00:15:01.695 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:01.738 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:01.739 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:01.739 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:01 compute-1 nova_compute[187157]: 2025-12-03 00:15:01.766 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:01 compute-1 systemd[1]: Starting libvirt proxy daemon...
Dec 03 00:15:01 compute-1 systemd[1]: Started libvirt proxy daemon.
Dec 03 00:15:01 compute-1 podman[217546]: 2025-12-03 00:15:01.877275067 +0000 UTC m=+0.050124742 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, container_name=ovn_metadata_agent)
Dec 03 00:15:01 compute-1 podman[217547]: 2025-12-03 00:15:01.932574185 +0000 UTC m=+0.098153371 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 03 00:15:02 compute-1 kernel: tapde4a7aac-87: entered promiscuous mode
Dec 03 00:15:02 compute-1 NetworkManager[55553]: <info>  [1764720902.0269] manager: (tapde4a7aac-87): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Dec 03 00:15:02 compute-1 ovn_controller[95464]: 2025-12-03T00:15:02Z|00208|binding|INFO|Claiming lport de4a7aac-87a1-4237-9c69-504ca4fa7d87 for this additional chassis.
Dec 03 00:15:02 compute-1 ovn_controller[95464]: 2025-12-03T00:15:02Z|00209|binding|INFO|de4a7aac-87a1-4237-9c69-504ca4fa7d87: Claiming fa:16:3e:da:29:37 10.100.0.8
Dec 03 00:15:02 compute-1 nova_compute[187157]: 2025-12-03 00:15:02.028 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:02 compute-1 nova_compute[187157]: 2025-12-03 00:15:02.034 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:02 compute-1 nova_compute[187157]: 2025-12-03 00:15:02.037 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.041 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:29:37 10.100.0.8'], port_security=['fa:16:3e:da:29:37 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '799be56b-eb56-4319-a027-b0fe2cf7991f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'f1e1fe27-b2d8-445b-bf72-1b1a8b133d14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f585d4-379a-4fd2-84e7-6b4069bbb279, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=de4a7aac-87a1-4237-9c69-504ca4fa7d87) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.042 104348 INFO neutron.agent.ovn.metadata.agent [-] Port de4a7aac-87a1-4237-9c69-504ca4fa7d87 in datapath ee60e03c-ab3a-419f-84ef-62aec4b6b0dd unbound from our chassis
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.043 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.055 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d298e56d-a7d5-443c-9cfd-8b3e0d277994]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.056 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee60e03c-a1 in ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.058 207957 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee60e03c-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.058 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[6bdea94a-315e-4313-a4c0-501377c12d49]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.059 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[c94bfd99-81a9-4d48-8889-3270b5b6586d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 systemd-udevd[217629]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:15:02 compute-1 systemd-machined[153454]: New machine qemu-19-instance-00000016.
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.069 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[237da6f0-4b7b-42d6-acd3-289ffe825892]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 NetworkManager[55553]: <info>  [1764720902.0723] device (tapde4a7aac-87): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:15:02 compute-1 NetworkManager[55553]: <info>  [1764720902.0736] device (tapde4a7aac-87): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.094 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[4de8192a-f39a-4e85-a329-a7d3f419d018]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 systemd[1]: Started Virtual Machine qemu-19-instance-00000016.
Dec 03 00:15:02 compute-1 nova_compute[187157]: 2025-12-03 00:15:02.099 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:02 compute-1 ovn_controller[95464]: 2025-12-03T00:15:02Z|00210|binding|INFO|Setting lport de4a7aac-87a1-4237-9c69-504ca4fa7d87 ovn-installed in OVS
Dec 03 00:15:02 compute-1 nova_compute[187157]: 2025-12-03 00:15:02.105 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.123 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[133a1608-7a89-48ae-aec5-79afeff558be]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.128 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[23d56f28-35ef-478e-b79f-0079d6946c3d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 systemd-udevd[217633]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:15:02 compute-1 NetworkManager[55553]: <info>  [1764720902.1295] manager: (tapee60e03c-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.158 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[8642fc0e-5698-440c-baff-eaf548142e3c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.160 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[9518c0a6-32f2-4047-9c30-e81b5cfadd33]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 NetworkManager[55553]: <info>  [1764720902.1815] device (tapee60e03c-a0): carrier: link connected
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.185 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[4fb285d8-0d06-46a1-8712-e6b581e55ed7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.198 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[391285f9-acb3-4954-9024-94d13206a492]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee60e03c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:2b:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492183, 'reachable_time': 20508, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217661, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.214 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[24745cd4-ea25-4df5-a9f6-1dce99a6d6c2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe99:2bfe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492183, 'tstamp': 492183}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217664, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.233 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[5ab64b7b-5798-4393-a65b-ebbe2307ef13]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee60e03c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:2b:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492183, 'reachable_time': 20508, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217668, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.264 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[69bbb670-dc00-45a5-a0a3-4a106ed444f0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.315 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[1e3f20e9-2518-42d5-ba14-d79940488b4d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.316 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee60e03c-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.316 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.317 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee60e03c-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:02 compute-1 kernel: tapee60e03c-a0: entered promiscuous mode
Dec 03 00:15:02 compute-1 nova_compute[187157]: 2025-12-03 00:15:02.318 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:02 compute-1 NetworkManager[55553]: <info>  [1764720902.3193] manager: (tapee60e03c-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Dec 03 00:15:02 compute-1 nova_compute[187157]: 2025-12-03 00:15:02.320 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.321 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee60e03c-a0, col_values=(('external_ids', {'iface-id': '42f0d9e7-7c77-4247-8972-6beac3a53206'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:02 compute-1 nova_compute[187157]: 2025-12-03 00:15:02.322 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:02 compute-1 ovn_controller[95464]: 2025-12-03T00:15:02Z|00211|binding|INFO|Releasing lport 42f0d9e7-7c77-4247-8972-6beac3a53206 from this chassis (sb_readonly=0)
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.324 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ef409c88-a2e3-49f6-a88f-4d55530a7f5b]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.325 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.325 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.325 104348 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for ee60e03c-ab3a-419f-84ef-62aec4b6b0dd disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.325 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.325 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[875c6cf0-1d41-4a56-834e-4b3043ee2b9a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.326 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.326 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[430b1efb-8555-4135-8022-2b835157330d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.326 104348 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: global
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     log         /dev/log local0 debug
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     log-tag     haproxy-metadata-proxy-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     user        root
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     group       root
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     maxconn     1024
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     pidfile     /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     daemon
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: defaults
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     log global
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     mode http
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     option httplog
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     option dontlognull
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     option http-server-close
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     option forwardfor
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     retries                 3
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     timeout http-request    30s
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     timeout connect         30s
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     timeout client          32s
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     timeout server          32s
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     timeout http-keep-alive 30s
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: listen listener
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     bind 169.254.169.254:80
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:     http-request add-header X-OVN-Network-ID ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:15:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:02.327 104348 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'env', 'PROCESS_TAG=haproxy-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:15:02 compute-1 nova_compute[187157]: 2025-12-03 00:15:02.334 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:02 compute-1 podman[217701]: 2025-12-03 00:15:02.707468726 +0000 UTC m=+0.045625811 container create 7af22851897db0f5d8aabd27f94340f09efa183bfd5d86d30389950655b7a374 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 00:15:02 compute-1 systemd[1]: Started libpod-conmon-7af22851897db0f5d8aabd27f94340f09efa183bfd5d86d30389950655b7a374.scope.
Dec 03 00:15:02 compute-1 podman[217701]: 2025-12-03 00:15:02.681936579 +0000 UTC m=+0.020093684 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:15:02 compute-1 systemd[1]: Started libcrun container.
Dec 03 00:15:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b780e8adc090a55b1381f23cdd9f0fad89db9c9f20a2d1fefd83e3dbc6fa9403/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:15:02 compute-1 podman[217701]: 2025-12-03 00:15:02.833763767 +0000 UTC m=+0.171920872 container init 7af22851897db0f5d8aabd27f94340f09efa183bfd5d86d30389950655b7a374 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 03 00:15:02 compute-1 podman[217701]: 2025-12-03 00:15:02.839362885 +0000 UTC m=+0.177519970 container start 7af22851897db0f5d8aabd27f94340f09efa183bfd5d86d30389950655b7a374 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest)
Dec 03 00:15:02 compute-1 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[217716]: [NOTICE]   (217734) : New worker (217736) forked
Dec 03 00:15:02 compute-1 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[217716]: [NOTICE]   (217734) : Loading success.
Dec 03 00:15:03 compute-1 nova_compute[187157]: 2025-12-03 00:15:03.393 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:04 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:04.150 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:15:04 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:04.151 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:15:04 compute-1 nova_compute[187157]: 2025-12-03 00:15:04.151 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:04 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:04.152 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:04 compute-1 ovn_controller[95464]: 2025-12-03T00:15:04Z|00212|binding|INFO|Claiming lport de4a7aac-87a1-4237-9c69-504ca4fa7d87 for this chassis.
Dec 03 00:15:04 compute-1 ovn_controller[95464]: 2025-12-03T00:15:04Z|00213|binding|INFO|de4a7aac-87a1-4237-9c69-504ca4fa7d87: Claiming fa:16:3e:da:29:37 10.100.0.8
Dec 03 00:15:04 compute-1 ovn_controller[95464]: 2025-12-03T00:15:04Z|00214|binding|INFO|Setting lport de4a7aac-87a1-4237-9c69-504ca4fa7d87 up in Southbound
Dec 03 00:15:05 compute-1 nova_compute[187157]: 2025-12-03 00:15:05.290 187161 INFO nova.compute.manager [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Post operation of migration started
Dec 03 00:15:05 compute-1 nova_compute[187157]: 2025-12-03 00:15:05.290 187161 WARNING neutronclient.v2_0.client [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:05 compute-1 podman[197537]: time="2025-12-03T00:15:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:15:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:15:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:15:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:15:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3076 "" "Go-http-client/1.1"
Dec 03 00:15:05 compute-1 nova_compute[187157]: 2025-12-03 00:15:05.832 187161 WARNING neutronclient.v2_0.client [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:05 compute-1 nova_compute[187157]: 2025-12-03 00:15:05.833 187161 WARNING neutronclient.v2_0.client [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:05 compute-1 nova_compute[187157]: 2025-12-03 00:15:05.927 187161 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-799be56b-eb56-4319-a027-b0fe2cf7991f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:15:05 compute-1 nova_compute[187157]: 2025-12-03 00:15:05.927 187161 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-799be56b-eb56-4319-a027-b0fe2cf7991f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:15:05 compute-1 nova_compute[187157]: 2025-12-03 00:15:05.927 187161 DEBUG nova.network.neutron [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:15:06 compute-1 nova_compute[187157]: 2025-12-03 00:15:06.437 187161 WARNING neutronclient.v2_0.client [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:06 compute-1 nova_compute[187157]: 2025-12-03 00:15:06.768 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:07 compute-1 nova_compute[187157]: 2025-12-03 00:15:07.040 187161 WARNING neutronclient.v2_0.client [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:07 compute-1 nova_compute[187157]: 2025-12-03 00:15:07.522 187161 DEBUG nova.network.neutron [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Updating instance_info_cache with network_info: [{"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:15:08 compute-1 nova_compute[187157]: 2025-12-03 00:15:08.031 187161 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-799be56b-eb56-4319-a027-b0fe2cf7991f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:15:08 compute-1 nova_compute[187157]: 2025-12-03 00:15:08.395 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:08 compute-1 nova_compute[187157]: 2025-12-03 00:15:08.545 187161 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:08 compute-1 nova_compute[187157]: 2025-12-03 00:15:08.546 187161 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:08 compute-1 nova_compute[187157]: 2025-12-03 00:15:08.546 187161 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:08 compute-1 nova_compute[187157]: 2025-12-03 00:15:08.550 187161 INFO nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 03 00:15:08 compute-1 virtqemud[186882]: Domain id=19 name='instance-00000016' uuid=799be56b-eb56-4319-a027-b0fe2cf7991f is tainted: custom-monitor
Dec 03 00:15:09 compute-1 nova_compute[187157]: 2025-12-03 00:15:09.555 187161 INFO nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 03 00:15:10 compute-1 nova_compute[187157]: 2025-12-03 00:15:10.560 187161 INFO nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 03 00:15:10 compute-1 nova_compute[187157]: 2025-12-03 00:15:10.565 187161 DEBUG nova.compute.manager [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:15:11 compute-1 nova_compute[187157]: 2025-12-03 00:15:11.074 187161 DEBUG nova.objects.instance [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 03 00:15:11 compute-1 nova_compute[187157]: 2025-12-03 00:15:11.770 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:12 compute-1 nova_compute[187157]: 2025-12-03 00:15:12.091 187161 WARNING neutronclient.v2_0.client [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:12 compute-1 nova_compute[187157]: 2025-12-03 00:15:12.836 187161 WARNING neutronclient.v2_0.client [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:12 compute-1 nova_compute[187157]: 2025-12-03 00:15:12.837 187161 WARNING neutronclient.v2_0.client [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:13 compute-1 nova_compute[187157]: 2025-12-03 00:15:13.396 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:15 compute-1 podman[217747]: 2025-12-03 00:15:15.212261919 +0000 UTC m=+0.057295849 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 03 00:15:16 compute-1 nova_compute[187157]: 2025-12-03 00:15:16.772 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:18 compute-1 nova_compute[187157]: 2025-12-03 00:15:18.397 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:19 compute-1 podman[217770]: 2025-12-03 00:15:19.22144296 +0000 UTC m=+0.060016455 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:15:19 compute-1 openstack_network_exporter[199685]: ERROR   00:15:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:15:19 compute-1 openstack_network_exporter[199685]: ERROR   00:15:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:15:19 compute-1 openstack_network_exporter[199685]: ERROR   00:15:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:15:19 compute-1 openstack_network_exporter[199685]: ERROR   00:15:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:15:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:15:19 compute-1 openstack_network_exporter[199685]: ERROR   00:15:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:15:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:15:21 compute-1 nova_compute[187157]: 2025-12-03 00:15:21.775 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:23 compute-1 nova_compute[187157]: 2025-12-03 00:15:23.399 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:23 compute-1 nova_compute[187157]: 2025-12-03 00:15:23.779 187161 DEBUG oslo_concurrency.lockutils [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:23 compute-1 nova_compute[187157]: 2025-12-03 00:15:23.780 187161 DEBUG oslo_concurrency.lockutils [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:23 compute-1 nova_compute[187157]: 2025-12-03 00:15:23.780 187161 DEBUG oslo_concurrency.lockutils [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:23 compute-1 nova_compute[187157]: 2025-12-03 00:15:23.780 187161 DEBUG oslo_concurrency.lockutils [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:23 compute-1 nova_compute[187157]: 2025-12-03 00:15:23.781 187161 DEBUG oslo_concurrency.lockutils [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:23 compute-1 nova_compute[187157]: 2025-12-03 00:15:23.799 187161 INFO nova.compute.manager [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Terminating instance
Dec 03 00:15:24 compute-1 nova_compute[187157]: 2025-12-03 00:15:24.319 187161 DEBUG nova.compute.manager [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:15:24 compute-1 kernel: tapde4a7aac-87 (unregistering): left promiscuous mode
Dec 03 00:15:24 compute-1 NetworkManager[55553]: <info>  [1764720924.3533] device (tapde4a7aac-87): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:15:24 compute-1 nova_compute[187157]: 2025-12-03 00:15:24.359 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:24 compute-1 ovn_controller[95464]: 2025-12-03T00:15:24Z|00215|binding|INFO|Releasing lport de4a7aac-87a1-4237-9c69-504ca4fa7d87 from this chassis (sb_readonly=0)
Dec 03 00:15:24 compute-1 ovn_controller[95464]: 2025-12-03T00:15:24Z|00216|binding|INFO|Setting lport de4a7aac-87a1-4237-9c69-504ca4fa7d87 down in Southbound
Dec 03 00:15:24 compute-1 ovn_controller[95464]: 2025-12-03T00:15:24Z|00217|binding|INFO|Removing iface tapde4a7aac-87 ovn-installed in OVS
Dec 03 00:15:24 compute-1 nova_compute[187157]: 2025-12-03 00:15:24.362 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:24 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:24.368 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:29:37 10.100.0.8'], port_security=['fa:16:3e:da:29:37 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '799be56b-eb56-4319-a027-b0fe2cf7991f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'f1e1fe27-b2d8-445b-bf72-1b1a8b133d14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f585d4-379a-4fd2-84e7-6b4069bbb279, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=de4a7aac-87a1-4237-9c69-504ca4fa7d87) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:15:24 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:24.368 104348 INFO neutron.agent.ovn.metadata.agent [-] Port de4a7aac-87a1-4237-9c69-504ca4fa7d87 in datapath ee60e03c-ab3a-419f-84ef-62aec4b6b0dd unbound from our chassis
Dec 03 00:15:24 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:24.370 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:15:24 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:24.372 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[25d3c388-d3f4-47aa-aa98-6ff408d3b4ae]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:24 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:24.372 104348 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd namespace which is not needed anymore
Dec 03 00:15:24 compute-1 nova_compute[187157]: 2025-12-03 00:15:24.378 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:24 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000016.scope: Deactivated successfully.
Dec 03 00:15:24 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000016.scope: Consumed 1.941s CPU time.
Dec 03 00:15:24 compute-1 systemd-machined[153454]: Machine qemu-19-instance-00000016 terminated.
Dec 03 00:15:24 compute-1 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[217716]: [NOTICE]   (217734) : haproxy version is 3.0.5-8e879a5
Dec 03 00:15:24 compute-1 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[217716]: [NOTICE]   (217734) : path to executable is /usr/sbin/haproxy
Dec 03 00:15:24 compute-1 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[217716]: [WARNING]  (217734) : Exiting Master process...
Dec 03 00:15:24 compute-1 podman[217818]: 2025-12-03 00:15:24.484180147 +0000 UTC m=+0.028595513 container kill 7af22851897db0f5d8aabd27f94340f09efa183bfd5d86d30389950655b7a374 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:15:24 compute-1 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[217716]: [ALERT]    (217734) : Current worker (217736) exited with code 143 (Terminated)
Dec 03 00:15:24 compute-1 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[217716]: [WARNING]  (217734) : All workers exited. Exiting... (0)
Dec 03 00:15:24 compute-1 nova_compute[187157]: 2025-12-03 00:15:24.485 187161 DEBUG nova.compute.manager [req-a0a620c1-ae1c-400e-be37-954e5bef7571 req-d9b4bc98-57e2-4e43-9dcd-aad6c789b3d8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:24 compute-1 nova_compute[187157]: 2025-12-03 00:15:24.486 187161 DEBUG oslo_concurrency.lockutils [req-a0a620c1-ae1c-400e-be37-954e5bef7571 req-d9b4bc98-57e2-4e43-9dcd-aad6c789b3d8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:24 compute-1 nova_compute[187157]: 2025-12-03 00:15:24.486 187161 DEBUG oslo_concurrency.lockutils [req-a0a620c1-ae1c-400e-be37-954e5bef7571 req-d9b4bc98-57e2-4e43-9dcd-aad6c789b3d8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:24 compute-1 nova_compute[187157]: 2025-12-03 00:15:24.487 187161 DEBUG oslo_concurrency.lockutils [req-a0a620c1-ae1c-400e-be37-954e5bef7571 req-d9b4bc98-57e2-4e43-9dcd-aad6c789b3d8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:24 compute-1 nova_compute[187157]: 2025-12-03 00:15:24.487 187161 DEBUG nova.compute.manager [req-a0a620c1-ae1c-400e-be37-954e5bef7571 req-d9b4bc98-57e2-4e43-9dcd-aad6c789b3d8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] No waiting events found dispatching network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:15:24 compute-1 systemd[1]: libpod-7af22851897db0f5d8aabd27f94340f09efa183bfd5d86d30389950655b7a374.scope: Deactivated successfully.
Dec 03 00:15:24 compute-1 nova_compute[187157]: 2025-12-03 00:15:24.487 187161 DEBUG nova.compute.manager [req-a0a620c1-ae1c-400e-be37-954e5bef7571 req-d9b4bc98-57e2-4e43-9dcd-aad6c789b3d8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:15:24 compute-1 podman[217831]: 2025-12-03 00:15:24.520575291 +0000 UTC m=+0.022079113 container died 7af22851897db0f5d8aabd27f94340f09efa183bfd5d86d30389950655b7a374 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:15:24 compute-1 nova_compute[187157]: 2025-12-03 00:15:24.537 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:24 compute-1 nova_compute[187157]: 2025-12-03 00:15:24.541 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:24 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7af22851897db0f5d8aabd27f94340f09efa183bfd5d86d30389950655b7a374-userdata-shm.mount: Deactivated successfully.
Dec 03 00:15:24 compute-1 systemd[1]: var-lib-containers-storage-overlay-b780e8adc090a55b1381f23cdd9f0fad89db9c9f20a2d1fefd83e3dbc6fa9403-merged.mount: Deactivated successfully.
Dec 03 00:15:24 compute-1 podman[217831]: 2025-12-03 00:15:24.601051408 +0000 UTC m=+0.102555230 container cleanup 7af22851897db0f5d8aabd27f94340f09efa183bfd5d86d30389950655b7a374 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Dec 03 00:15:24 compute-1 systemd[1]: libpod-conmon-7af22851897db0f5d8aabd27f94340f09efa183bfd5d86d30389950655b7a374.scope: Deactivated successfully.
Dec 03 00:15:24 compute-1 nova_compute[187157]: 2025-12-03 00:15:24.618 187161 INFO nova.virt.libvirt.driver [-] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Instance destroyed successfully.
Dec 03 00:15:24 compute-1 nova_compute[187157]: 2025-12-03 00:15:24.619 187161 DEBUG nova.objects.instance [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lazy-loading 'resources' on Instance uuid 799be56b-eb56-4319-a027-b0fe2cf7991f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:15:24 compute-1 podman[217838]: 2025-12-03 00:15:24.620774732 +0000 UTC m=+0.108758972 container remove 7af22851897db0f5d8aabd27f94340f09efa183bfd5d86d30389950655b7a374 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 03 00:15:24 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:24.625 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[f3622e4b-c4af-4826-acd5-882acdfceaac]: (4, ("Wed Dec  3 12:15:24 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd (7af22851897db0f5d8aabd27f94340f09efa183bfd5d86d30389950655b7a374)\n7af22851897db0f5d8aabd27f94340f09efa183bfd5d86d30389950655b7a374\nWed Dec  3 12:15:24 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd (7af22851897db0f5d8aabd27f94340f09efa183bfd5d86d30389950655b7a374)\n7af22851897db0f5d8aabd27f94340f09efa183bfd5d86d30389950655b7a374\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:24 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:24.627 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[160314e4-e2d3-4bb2-9373-61e175c0c8bb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:24 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:24.627 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:15:24 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:24.627 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[15940515-2e4e-436c-b4a9-215bb4a5473d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:24 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:24.628 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee60e03c-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:24 compute-1 nova_compute[187157]: 2025-12-03 00:15:24.629 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:24 compute-1 kernel: tapee60e03c-a0: left promiscuous mode
Dec 03 00:15:24 compute-1 nova_compute[187157]: 2025-12-03 00:15:24.645 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:24 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:24.647 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a2c9590f-cb8b-4d26-b168-952d67be690a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:24 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:24.660 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[17b98490-cf85-4c13-af05-988d4e8cbe28]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:24 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:24.661 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[f28e501e-c67f-4a1e-a0cf-d52ce7c0aa2b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:24 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:24.679 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d8242dbb-6105-48f9-a040-66f48e86d05a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492177, 'reachable_time': 30745, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217883, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:24 compute-1 systemd[1]: run-netns-ovnmeta\x2dee60e03c\x2dab3a\x2d419f\x2d84ef\x2d62aec4b6b0dd.mount: Deactivated successfully.
Dec 03 00:15:24 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:24.681 104464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:15:24 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:15:24.682 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[67eba324-e4bd-437c-8726-d22c44ef10b9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.124 187161 DEBUG nova.virt.libvirt.vif [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-03T00:13:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1118620355',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1118620355',id=22,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:14:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-103ztj3x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',clean_attempts='1',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:15:11Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=799be56b-eb56-4319-a027-b0fe2cf7991f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.124 187161 DEBUG nova.network.os_vif_util [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converting VIF {"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.125 187161 DEBUG nova.network.os_vif_util [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:da:29:37,bridge_name='br-int',has_traffic_filtering=True,id=de4a7aac-87a1-4237-9c69-504ca4fa7d87,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde4a7aac-87') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.125 187161 DEBUG os_vif [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:da:29:37,bridge_name='br-int',has_traffic_filtering=True,id=de4a7aac-87a1-4237-9c69-504ca4fa7d87,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde4a7aac-87') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.126 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.126 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde4a7aac-87, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.175 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.177 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.178 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.178 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=fb12a03b-5849-424a-903d-8d68c4da7653) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.179 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.180 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.181 187161 INFO os_vif [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:da:29:37,bridge_name='br-int',has_traffic_filtering=True,id=de4a7aac-87a1-4237-9c69-504ca4fa7d87,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde4a7aac-87')
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.182 187161 INFO nova.virt.libvirt.driver [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Deleting instance files /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f_del
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.182 187161 INFO nova.virt.libvirt.driver [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Deletion of /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f_del complete
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.693 187161 INFO nova.compute.manager [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Took 1.37 seconds to destroy the instance on the hypervisor.
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.694 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.694 187161 DEBUG nova.compute.manager [-] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.694 187161 DEBUG nova.network.neutron [-] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.695 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:25 compute-1 nova_compute[187157]: 2025-12-03 00:15:25.828 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:26 compute-1 nova_compute[187157]: 2025-12-03 00:15:26.231 187161 DEBUG nova.compute.manager [req-06ed4ad3-3bad-4c4c-99ea-180e29be7755 req-9ed9678a-7438-4f52-a18e-e3c95e2f56e7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-deleted-de4a7aac-87a1-4237-9c69-504ca4fa7d87 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:26 compute-1 nova_compute[187157]: 2025-12-03 00:15:26.231 187161 INFO nova.compute.manager [req-06ed4ad3-3bad-4c4c-99ea-180e29be7755 req-9ed9678a-7438-4f52-a18e-e3c95e2f56e7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Neutron deleted interface de4a7aac-87a1-4237-9c69-504ca4fa7d87; detaching it from the instance and deleting it from the info cache
Dec 03 00:15:26 compute-1 nova_compute[187157]: 2025-12-03 00:15:26.232 187161 DEBUG nova.network.neutron [req-06ed4ad3-3bad-4c4c-99ea-180e29be7755 req-9ed9678a-7438-4f52-a18e-e3c95e2f56e7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:15:26 compute-1 nova_compute[187157]: 2025-12-03 00:15:26.536 187161 DEBUG nova.compute.manager [req-6bfae965-6627-41bb-abe0-d22b36b6281e req-24ad77c6-a726-4ac7-9019-e073a9ba0b78 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:26 compute-1 nova_compute[187157]: 2025-12-03 00:15:26.537 187161 DEBUG oslo_concurrency.lockutils [req-6bfae965-6627-41bb-abe0-d22b36b6281e req-24ad77c6-a726-4ac7-9019-e073a9ba0b78 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:26 compute-1 nova_compute[187157]: 2025-12-03 00:15:26.537 187161 DEBUG oslo_concurrency.lockutils [req-6bfae965-6627-41bb-abe0-d22b36b6281e req-24ad77c6-a726-4ac7-9019-e073a9ba0b78 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:26 compute-1 nova_compute[187157]: 2025-12-03 00:15:26.537 187161 DEBUG oslo_concurrency.lockutils [req-6bfae965-6627-41bb-abe0-d22b36b6281e req-24ad77c6-a726-4ac7-9019-e073a9ba0b78 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:26 compute-1 nova_compute[187157]: 2025-12-03 00:15:26.538 187161 DEBUG nova.compute.manager [req-6bfae965-6627-41bb-abe0-d22b36b6281e req-24ad77c6-a726-4ac7-9019-e073a9ba0b78 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] No waiting events found dispatching network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:15:26 compute-1 nova_compute[187157]: 2025-12-03 00:15:26.538 187161 DEBUG nova.compute.manager [req-6bfae965-6627-41bb-abe0-d22b36b6281e req-24ad77c6-a726-4ac7-9019-e073a9ba0b78 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:15:26 compute-1 nova_compute[187157]: 2025-12-03 00:15:26.690 187161 DEBUG nova.network.neutron [-] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:15:26 compute-1 nova_compute[187157]: 2025-12-03 00:15:26.738 187161 DEBUG nova.compute.manager [req-06ed4ad3-3bad-4c4c-99ea-180e29be7755 req-9ed9678a-7438-4f52-a18e-e3c95e2f56e7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Detach interface failed, port_id=de4a7aac-87a1-4237-9c69-504ca4fa7d87, reason: Instance 799be56b-eb56-4319-a027-b0fe2cf7991f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:15:27 compute-1 nova_compute[187157]: 2025-12-03 00:15:27.198 187161 INFO nova.compute.manager [-] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Took 1.50 seconds to deallocate network for instance.
Dec 03 00:15:27 compute-1 nova_compute[187157]: 2025-12-03 00:15:27.718 187161 DEBUG oslo_concurrency.lockutils [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:27 compute-1 nova_compute[187157]: 2025-12-03 00:15:27.718 187161 DEBUG oslo_concurrency.lockutils [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:27 compute-1 nova_compute[187157]: 2025-12-03 00:15:27.723 187161 DEBUG oslo_concurrency.lockutils [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:27 compute-1 nova_compute[187157]: 2025-12-03 00:15:27.798 187161 INFO nova.scheduler.client.report [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Deleted allocations for instance 799be56b-eb56-4319-a027-b0fe2cf7991f
Dec 03 00:15:28 compute-1 podman[217884]: 2025-12-03 00:15:28.214241783 +0000 UTC m=+0.057836461 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:15:28 compute-1 nova_compute[187157]: 2025-12-03 00:15:28.402 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:28 compute-1 nova_compute[187157]: 2025-12-03 00:15:28.828 187161 DEBUG oslo_concurrency.lockutils [None req-70c5ea38-9fe2-481f-b97d-40ca6a5e6e8d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.049s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:30 compute-1 nova_compute[187157]: 2025-12-03 00:15:30.180 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:32 compute-1 podman[217909]: 2025-12-03 00:15:32.244978143 +0000 UTC m=+0.082773584 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:15:32 compute-1 podman[217910]: 2025-12-03 00:15:32.277373259 +0000 UTC m=+0.117315552 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 03 00:15:32 compute-1 sshd-session[217954]: Invalid user sol from 45.148.10.240 port 56962
Dec 03 00:15:32 compute-1 sshd-session[217954]: Connection closed by invalid user sol 45.148.10.240 port 56962 [preauth]
Dec 03 00:15:33 compute-1 nova_compute[187157]: 2025-12-03 00:15:33.428 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:35 compute-1 nova_compute[187157]: 2025-12-03 00:15:35.181 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:35 compute-1 podman[197537]: time="2025-12-03T00:15:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:15:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:15:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:15:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:15:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2613 "" "Go-http-client/1.1"
Dec 03 00:15:38 compute-1 nova_compute[187157]: 2025-12-03 00:15:38.431 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:40 compute-1 nova_compute[187157]: 2025-12-03 00:15:40.183 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:42 compute-1 nova_compute[187157]: 2025-12-03 00:15:42.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:43 compute-1 nova_compute[187157]: 2025-12-03 00:15:43.431 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:45 compute-1 nova_compute[187157]: 2025-12-03 00:15:45.216 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:46 compute-1 podman[217956]: 2025-12-03 00:15:46.225502039 +0000 UTC m=+0.062371562 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, version=9.6, name=ubi9-minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 03 00:15:47 compute-1 nova_compute[187157]: 2025-12-03 00:15:47.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:48 compute-1 nova_compute[187157]: 2025-12-03 00:15:48.433 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:48 compute-1 nova_compute[187157]: 2025-12-03 00:15:48.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:48 compute-1 nova_compute[187157]: 2025-12-03 00:15:48.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:49 compute-1 nova_compute[187157]: 2025-12-03 00:15:49.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:49 compute-1 nova_compute[187157]: 2025-12-03 00:15:49.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:49 compute-1 nova_compute[187157]: 2025-12-03 00:15:49.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:49 compute-1 nova_compute[187157]: 2025-12-03 00:15:49.217 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:15:49 compute-1 podman[217978]: 2025-12-03 00:15:49.31238804 +0000 UTC m=+0.060975509 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:15:49 compute-1 nova_compute[187157]: 2025-12-03 00:15:49.362 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:15:49 compute-1 nova_compute[187157]: 2025-12-03 00:15:49.363 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:15:49 compute-1 nova_compute[187157]: 2025-12-03 00:15:49.377 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:15:49 compute-1 nova_compute[187157]: 2025-12-03 00:15:49.378 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5830MB free_disk=73.1661262512207GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:15:49 compute-1 nova_compute[187157]: 2025-12-03 00:15:49.378 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:49 compute-1 nova_compute[187157]: 2025-12-03 00:15:49.378 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:49 compute-1 openstack_network_exporter[199685]: ERROR   00:15:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:15:49 compute-1 openstack_network_exporter[199685]: ERROR   00:15:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:15:49 compute-1 openstack_network_exporter[199685]: ERROR   00:15:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:15:49 compute-1 openstack_network_exporter[199685]: ERROR   00:15:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:15:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:15:49 compute-1 openstack_network_exporter[199685]: ERROR   00:15:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:15:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:15:50 compute-1 nova_compute[187157]: 2025-12-03 00:15:50.260 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:50 compute-1 nova_compute[187157]: 2025-12-03 00:15:50.426 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:15:50 compute-1 nova_compute[187157]: 2025-12-03 00:15:50.427 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:15:49 up  1:22,  0 user,  load average: 0.05, 0.26, 0.33\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:15:50 compute-1 nova_compute[187157]: 2025-12-03 00:15:50.452 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:15:50 compute-1 nova_compute[187157]: 2025-12-03 00:15:50.964 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:15:51 compute-1 nova_compute[187157]: 2025-12-03 00:15:51.475 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:15:51 compute-1 nova_compute[187157]: 2025-12-03 00:15:51.475 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.097s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:53 compute-1 nova_compute[187157]: 2025-12-03 00:15:53.435 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:55 compute-1 nova_compute[187157]: 2025-12-03 00:15:55.292 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:56 compute-1 nova_compute[187157]: 2025-12-03 00:15:56.471 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:57 compute-1 nova_compute[187157]: 2025-12-03 00:15:57.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:58 compute-1 nova_compute[187157]: 2025-12-03 00:15:58.437 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:58 compute-1 nova_compute[187157]: 2025-12-03 00:15:58.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:58 compute-1 nova_compute[187157]: 2025-12-03 00:15:58.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:15:59 compute-1 podman[217999]: 2025-12-03 00:15:59.201311111 +0000 UTC m=+0.047270692 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:16:00 compute-1 nova_compute[187157]: 2025-12-03 00:16:00.295 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:00 compute-1 nova_compute[187157]: 2025-12-03 00:16:00.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:16:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:01.740 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:01.741 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:01.741 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:03 compute-1 podman[218024]: 2025-12-03 00:16:03.205314745 +0000 UTC m=+0.049805014 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 00:16:03 compute-1 podman[218025]: 2025-12-03 00:16:03.247243904 +0000 UTC m=+0.085461510 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4)
Dec 03 00:16:03 compute-1 nova_compute[187157]: 2025-12-03 00:16:03.438 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:05 compute-1 nova_compute[187157]: 2025-12-03 00:16:05.297 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:05 compute-1 podman[197537]: time="2025-12-03T00:16:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:16:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:16:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:16:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:16:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2618 "" "Go-http-client/1.1"
Dec 03 00:16:08 compute-1 nova_compute[187157]: 2025-12-03 00:16:08.439 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:09.065 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:16:09 compute-1 nova_compute[187157]: 2025-12-03 00:16:09.066 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:09.066 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:16:10 compute-1 nova_compute[187157]: 2025-12-03 00:16:10.299 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:13 compute-1 nova_compute[187157]: 2025-12-03 00:16:13.440 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:14.068 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:15 compute-1 nova_compute[187157]: 2025-12-03 00:16:15.341 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:17 compute-1 podman[218069]: 2025-12-03 00:16:17.204645455 +0000 UTC m=+0.042369292 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git)
Dec 03 00:16:18 compute-1 nova_compute[187157]: 2025-12-03 00:16:18.442 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:19 compute-1 openstack_network_exporter[199685]: ERROR   00:16:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:16:19 compute-1 openstack_network_exporter[199685]: ERROR   00:16:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:16:19 compute-1 openstack_network_exporter[199685]: ERROR   00:16:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:16:19 compute-1 openstack_network_exporter[199685]: ERROR   00:16:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:16:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:16:19 compute-1 openstack_network_exporter[199685]: ERROR   00:16:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:16:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:16:20 compute-1 podman[218089]: 2025-12-03 00:16:20.215342271 +0000 UTC m=+0.053936545 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 03 00:16:20 compute-1 nova_compute[187157]: 2025-12-03 00:16:20.343 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:23 compute-1 nova_compute[187157]: 2025-12-03 00:16:23.442 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:25 compute-1 nova_compute[187157]: 2025-12-03 00:16:25.345 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:28 compute-1 nova_compute[187157]: 2025-12-03 00:16:28.474 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:29 compute-1 nova_compute[187157]: 2025-12-03 00:16:29.576 187161 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Creating tmpfile /var/lib/nova/instances/tmpzam0o4uu to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 03 00:16:29 compute-1 nova_compute[187157]: 2025-12-03 00:16:29.577 187161 WARNING neutronclient.v2_0.client [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:29 compute-1 nova_compute[187157]: 2025-12-03 00:16:29.580 187161 DEBUG nova.compute.manager [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzam0o4uu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 03 00:16:30 compute-1 podman[218110]: 2025-12-03 00:16:30.199094992 +0000 UTC m=+0.046879862 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:16:30 compute-1 nova_compute[187157]: 2025-12-03 00:16:30.347 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:31 compute-1 nova_compute[187157]: 2025-12-03 00:16:31.637 187161 WARNING neutronclient.v2_0.client [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:33 compute-1 nova_compute[187157]: 2025-12-03 00:16:33.476 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:34 compute-1 podman[218134]: 2025-12-03 00:16:34.210384945 +0000 UTC m=+0.052167189 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 03 00:16:34 compute-1 podman[218135]: 2025-12-03 00:16:34.249624609 +0000 UTC m=+0.090070981 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 03 00:16:35 compute-1 nova_compute[187157]: 2025-12-03 00:16:35.389 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:35 compute-1 podman[197537]: time="2025-12-03T00:16:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:16:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:16:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:16:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:16:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2614 "" "Go-http-client/1.1"
Dec 03 00:16:35 compute-1 nova_compute[187157]: 2025-12-03 00:16:35.960 187161 DEBUG nova.compute.manager [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzam0o4uu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c1df5044-c7ad-42e6-93bd-4b5a853ab3b8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 03 00:16:36 compute-1 nova_compute[187157]: 2025-12-03 00:16:36.975 187161 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:16:36 compute-1 nova_compute[187157]: 2025-12-03 00:16:36.975 187161 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:16:36 compute-1 nova_compute[187157]: 2025-12-03 00:16:36.975 187161 DEBUG nova.network.neutron [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:16:37 compute-1 nova_compute[187157]: 2025-12-03 00:16:37.593 187161 WARNING neutronclient.v2_0.client [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:37 compute-1 nova_compute[187157]: 2025-12-03 00:16:37.978 187161 WARNING neutronclient.v2_0.client [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:38 compute-1 ovn_controller[95464]: 2025-12-03T00:16:38Z|00218|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Dec 03 00:16:38 compute-1 nova_compute[187157]: 2025-12-03 00:16:38.477 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:38 compute-1 nova_compute[187157]: 2025-12-03 00:16:38.914 187161 DEBUG nova.network.neutron [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Updating instance_info_cache with network_info: [{"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:16:39 compute-1 nova_compute[187157]: 2025-12-03 00:16:39.420 187161 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:16:39 compute-1 nova_compute[187157]: 2025-12-03 00:16:39.432 187161 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzam0o4uu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c1df5044-c7ad-42e6-93bd-4b5a853ab3b8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 03 00:16:39 compute-1 nova_compute[187157]: 2025-12-03 00:16:39.433 187161 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Creating instance directory: /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 03 00:16:39 compute-1 nova_compute[187157]: 2025-12-03 00:16:39.433 187161 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Creating disk.info with the contents: {'/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk': 'qcow2', '/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 03 00:16:39 compute-1 nova_compute[187157]: 2025-12-03 00:16:39.434 187161 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 03 00:16:39 compute-1 nova_compute[187157]: 2025-12-03 00:16:39.434 187161 DEBUG nova.objects.instance [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'trusted_certs' on Instance uuid c1df5044-c7ad-42e6-93bd-4b5a853ab3b8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:16:39 compute-1 nova_compute[187157]: 2025-12-03 00:16:39.939 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:16:39 compute-1 nova_compute[187157]: 2025-12-03 00:16:39.943 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:16:39 compute-1 nova_compute[187157]: 2025-12-03 00:16:39.944 187161 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:39 compute-1 nova_compute[187157]: 2025-12-03 00:16:39.992 187161 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:39 compute-1 nova_compute[187157]: 2025-12-03 00:16:39.993 187161 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:39 compute-1 nova_compute[187157]: 2025-12-03 00:16:39.994 187161 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:39 compute-1 nova_compute[187157]: 2025-12-03 00:16:39.995 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:16:39 compute-1 nova_compute[187157]: 2025-12-03 00:16:39.998 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:16:39 compute-1 nova_compute[187157]: 2025-12-03 00:16:39.999 187161 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.045 187161 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.046 187161 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.081 187161 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.083 187161 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.089s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.083 187161 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.143 187161 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.144 187161 DEBUG nova.virt.disk.api [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.144 187161 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.228 187161 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.229 187161 DEBUG nova.virt.disk.api [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.229 187161 DEBUG nova.objects.instance [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid c1df5044-c7ad-42e6-93bd-4b5a853ab3b8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.390 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.736 187161 DEBUG nova.objects.base [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<c1df5044-c7ad-42e6-93bd-4b5a853ab3b8> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.737 187161 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.756 187161 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk.config 497664" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.757 187161 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.758 187161 DEBUG nova.virt.libvirt.vif [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:15:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-984503060',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-984503060',id=24,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:15:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-4tk0mv8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:15:49Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=c1df5044-c7ad-42e6-93bd-4b5a853ab3b8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.758 187161 DEBUG nova.network.os_vif_util [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.759 187161 DEBUG nova.network.os_vif_util [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:86:86,bridge_name='br-int',has_traffic_filtering=True,id=75f7bf8b-141c-44e2-be3c-1fdae9af1077,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f7bf8b-14') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.759 187161 DEBUG os_vif [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:86:86,bridge_name='br-int',has_traffic_filtering=True,id=75f7bf8b-141c-44e2-be3c-1fdae9af1077,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f7bf8b-14') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.760 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.765 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.765 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.766 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.766 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '31f8fe63-1f27-59b5-b890-f896a59a7ed5', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.768 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.769 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.771 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.772 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.773 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap75f7bf8b-14, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.773 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap75f7bf8b-14, col_values=(('qos', UUID('15fd200a-0255-49e6-85a5-644e548046bf')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.773 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap75f7bf8b-14, col_values=(('external_ids', {'iface-id': '75f7bf8b-141c-44e2-be3c-1fdae9af1077', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e6:86:86', 'vm-uuid': 'c1df5044-c7ad-42e6-93bd-4b5a853ab3b8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.774 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:40 compute-1 NetworkManager[55553]: <info>  [1764721000.7753] manager: (tap75f7bf8b-14): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.776 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.780 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.781 187161 INFO os_vif [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:86:86,bridge_name='br-int',has_traffic_filtering=True,id=75f7bf8b-141c-44e2-be3c-1fdae9af1077,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f7bf8b-14')
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.781 187161 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.782 187161 DEBUG nova.compute.manager [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzam0o4uu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c1df5044-c7ad-42e6-93bd-4b5a853ab3b8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.782 187161 WARNING neutronclient.v2_0.client [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:40 compute-1 nova_compute[187157]: 2025-12-03 00:16:40.877 187161 WARNING neutronclient.v2_0.client [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:42 compute-1 nova_compute[187157]: 2025-12-03 00:16:42.960 187161 DEBUG nova.network.neutron [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Port 75f7bf8b-141c-44e2-be3c-1fdae9af1077 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 03 00:16:42 compute-1 nova_compute[187157]: 2025-12-03 00:16:42.975 187161 DEBUG nova.compute.manager [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzam0o4uu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c1df5044-c7ad-42e6-93bd-4b5a853ab3b8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 03 00:16:43 compute-1 nova_compute[187157]: 2025-12-03 00:16:43.527 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:43 compute-1 nova_compute[187157]: 2025-12-03 00:16:43.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:16:45 compute-1 kernel: tap75f7bf8b-14: entered promiscuous mode
Dec 03 00:16:45 compute-1 NetworkManager[55553]: <info>  [1764721005.6341] manager: (tap75f7bf8b-14): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Dec 03 00:16:45 compute-1 ovn_controller[95464]: 2025-12-03T00:16:45Z|00219|binding|INFO|Claiming lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 for this additional chassis.
Dec 03 00:16:45 compute-1 ovn_controller[95464]: 2025-12-03T00:16:45Z|00220|binding|INFO|75f7bf8b-141c-44e2-be3c-1fdae9af1077: Claiming fa:16:3e:e6:86:86 10.100.0.3
Dec 03 00:16:45 compute-1 nova_compute[187157]: 2025-12-03 00:16:45.648 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:45 compute-1 ovn_controller[95464]: 2025-12-03T00:16:45Z|00221|binding|INFO|Setting lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 ovn-installed in OVS
Dec 03 00:16:45 compute-1 nova_compute[187157]: 2025-12-03 00:16:45.655 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.656 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:86:86 10.100.0.3'], port_security=['fa:16:3e:e6:86:86 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c1df5044-c7ad-42e6-93bd-4b5a853ab3b8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'f1e1fe27-b2d8-445b-bf72-1b1a8b133d14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f585d4-379a-4fd2-84e7-6b4069bbb279, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=75f7bf8b-141c-44e2-be3c-1fdae9af1077) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.657 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 75f7bf8b-141c-44e2-be3c-1fdae9af1077 in datapath ee60e03c-ab3a-419f-84ef-62aec4b6b0dd unbound from our chassis
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.659 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:16:45 compute-1 nova_compute[187157]: 2025-12-03 00:16:45.661 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.674 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[eb14bfbf-82b9-4ef9-9365-b95e390f20ba]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.675 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee60e03c-a1 in ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.677 207957 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee60e03c-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.677 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[7c49c82f-76bb-40e3-9dd7-6aa58211083c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.678 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed5fe31-6492-4d62-b52a-1c52c27365e1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 systemd-machined[153454]: New machine qemu-20-instance-00000018.
Dec 03 00:16:45 compute-1 systemd-udevd[218211]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.689 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[f4c25205-dbbe-4a29-b719-1968ae5728bd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 NetworkManager[55553]: <info>  [1764721005.6963] device (tap75f7bf8b-14): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:16:45 compute-1 NetworkManager[55553]: <info>  [1764721005.6969] device (tap75f7bf8b-14): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:16:45 compute-1 systemd[1]: Started Virtual Machine qemu-20-instance-00000018.
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.705 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[bef75a5f-a002-49be-944c-dc42c1f7538b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.728 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[e9ecb1e8-ec34-4b98-ade2-6360f9dc4b3e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 NetworkManager[55553]: <info>  [1764721005.7339] manager: (tapee60e03c-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/79)
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.733 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[18193fed-c38e-40b6-a661-301bf11da652]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.764 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[a8a344d6-d6e4-4cd1-9a1f-ac8ff80bc81d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.767 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[e4b3da77-00f5-49a1-a143-3423d4b4e16d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 nova_compute[187157]: 2025-12-03 00:16:45.774 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:45 compute-1 NetworkManager[55553]: <info>  [1764721005.7839] device (tapee60e03c-a0): carrier: link connected
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.788 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[7ec769cb-1d8c-404e-b670-21a90bc9303a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.801 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[df012a39-a988-4bfe-806e-54baade36e2f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee60e03c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:2b:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502544, 'reachable_time': 34817, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218243, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.814 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[cd7497e3-9b04-450a-8240-5b2ef29d8438]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe99:2bfe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 502544, 'tstamp': 502544}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218246, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.830 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[22f26b30-5d13-407a-a828-83c15545a3ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee60e03c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:2b:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502544, 'reachable_time': 34817, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218251, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.856 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ad25e371-4317-44b5-88ce-6126f722d55e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.913 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[db04fc78-349f-4a9a-b52d-8f8e338419db]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.915 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee60e03c-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.915 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.916 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee60e03c-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:45 compute-1 kernel: tapee60e03c-a0: entered promiscuous mode
Dec 03 00:16:45 compute-1 nova_compute[187157]: 2025-12-03 00:16:45.917 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:45 compute-1 NetworkManager[55553]: <info>  [1764721005.9192] manager: (tapee60e03c-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.919 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee60e03c-a0, col_values=(('external_ids', {'iface-id': '42f0d9e7-7c77-4247-8972-6beac3a53206'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:45 compute-1 nova_compute[187157]: 2025-12-03 00:16:45.920 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:45 compute-1 ovn_controller[95464]: 2025-12-03T00:16:45Z|00222|binding|INFO|Releasing lport 42f0d9e7-7c77-4247-8972-6beac3a53206 from this chassis (sb_readonly=0)
Dec 03 00:16:45 compute-1 nova_compute[187157]: 2025-12-03 00:16:45.932 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.933 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d7934ef4-91ab-45a3-a173-96f233655d8e]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.934 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.934 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.934 104348 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for ee60e03c-ab3a-419f-84ef-62aec4b6b0dd disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.934 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.935 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[914a8c15-c3e0-4d71-a62d-a735a326cd29]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.935 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.935 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[be18593c-ca0c-416e-967c-d3013b734f90]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.936 104348 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: global
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     log         /dev/log local0 debug
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     log-tag     haproxy-metadata-proxy-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     user        root
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     group       root
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     maxconn     1024
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     pidfile     /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     daemon
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: defaults
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     log global
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     mode http
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     option httplog
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     option dontlognull
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     option http-server-close
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     option forwardfor
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     retries                 3
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     timeout http-request    30s
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     timeout connect         30s
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     timeout client          32s
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     timeout server          32s
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     timeout http-keep-alive 30s
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: listen listener
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     bind 169.254.169.254:80
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:     http-request add-header X-OVN-Network-ID ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:16:45 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:16:45.936 104348 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'env', 'PROCESS_TAG=haproxy-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:16:46 compute-1 podman[218284]: 2025-12-03 00:16:46.312318327 +0000 UTC m=+0.050695553 container create e98fc35fb88b0c267d1e838712c3d5e2882277656e5663dd5abcc11954aa83bf (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 03 00:16:46 compute-1 systemd[1]: Started libpod-conmon-e98fc35fb88b0c267d1e838712c3d5e2882277656e5663dd5abcc11954aa83bf.scope.
Dec 03 00:16:46 compute-1 podman[218284]: 2025-12-03 00:16:46.284346937 +0000 UTC m=+0.022724143 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:16:46 compute-1 systemd[1]: Started libcrun container.
Dec 03 00:16:46 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14b770b258dbfe5d4bdcf538762ecd27513f87192eecaf482dc6f425f91f49f5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:16:46 compute-1 podman[218284]: 2025-12-03 00:16:46.406798193 +0000 UTC m=+0.145175459 container init e98fc35fb88b0c267d1e838712c3d5e2882277656e5663dd5abcc11954aa83bf (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:16:46 compute-1 podman[218284]: 2025-12-03 00:16:46.413760102 +0000 UTC m=+0.152137328 container start e98fc35fb88b0c267d1e838712c3d5e2882277656e5663dd5abcc11954aa83bf (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, tcib_managed=true)
Dec 03 00:16:46 compute-1 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218300]: [NOTICE]   (218304) : New worker (218306) forked
Dec 03 00:16:46 compute-1 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218300]: [NOTICE]   (218304) : Loading success.
Dec 03 00:16:47 compute-1 nova_compute[187157]: 2025-12-03 00:16:47.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:16:48 compute-1 podman[218319]: 2025-12-03 00:16:48.251703832 +0000 UTC m=+0.087320062 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc.)
Dec 03 00:16:48 compute-1 ovn_controller[95464]: 2025-12-03T00:16:48Z|00223|binding|INFO|Claiming lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 for this chassis.
Dec 03 00:16:48 compute-1 ovn_controller[95464]: 2025-12-03T00:16:48Z|00224|binding|INFO|75f7bf8b-141c-44e2-be3c-1fdae9af1077: Claiming fa:16:3e:e6:86:86 10.100.0.3
Dec 03 00:16:48 compute-1 ovn_controller[95464]: 2025-12-03T00:16:48Z|00225|binding|INFO|Setting lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 up in Southbound
Dec 03 00:16:48 compute-1 nova_compute[187157]: 2025-12-03 00:16:48.527 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:48 compute-1 nova_compute[187157]: 2025-12-03 00:16:48.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:16:48 compute-1 nova_compute[187157]: 2025-12-03 00:16:48.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:16:49 compute-1 nova_compute[187157]: 2025-12-03 00:16:49.212 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:49 compute-1 nova_compute[187157]: 2025-12-03 00:16:49.213 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:49 compute-1 nova_compute[187157]: 2025-12-03 00:16:49.213 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:49 compute-1 nova_compute[187157]: 2025-12-03 00:16:49.213 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:16:49 compute-1 openstack_network_exporter[199685]: ERROR   00:16:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:16:49 compute-1 openstack_network_exporter[199685]: ERROR   00:16:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:16:49 compute-1 openstack_network_exporter[199685]: ERROR   00:16:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:16:49 compute-1 openstack_network_exporter[199685]: ERROR   00:16:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:16:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:16:49 compute-1 openstack_network_exporter[199685]: ERROR   00:16:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:16:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:16:49 compute-1 nova_compute[187157]: 2025-12-03 00:16:49.843 187161 INFO nova.compute.manager [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Post operation of migration started
Dec 03 00:16:49 compute-1 nova_compute[187157]: 2025-12-03 00:16:49.843 187161 WARNING neutronclient.v2_0.client [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:50 compute-1 nova_compute[187157]: 2025-12-03 00:16:50.253 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:50 compute-1 nova_compute[187157]: 2025-12-03 00:16:50.305 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:50 compute-1 nova_compute[187157]: 2025-12-03 00:16:50.306 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:50 compute-1 nova_compute[187157]: 2025-12-03 00:16:50.360 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:50 compute-1 nova_compute[187157]: 2025-12-03 00:16:50.498 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:16:50 compute-1 nova_compute[187157]: 2025-12-03 00:16:50.500 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:50 compute-1 nova_compute[187157]: 2025-12-03 00:16:50.517 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:50 compute-1 nova_compute[187157]: 2025-12-03 00:16:50.518 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5611MB free_disk=73.13694763183594GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:16:50 compute-1 nova_compute[187157]: 2025-12-03 00:16:50.519 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:50 compute-1 nova_compute[187157]: 2025-12-03 00:16:50.519 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:50 compute-1 nova_compute[187157]: 2025-12-03 00:16:50.776 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:50 compute-1 nova_compute[187157]: 2025-12-03 00:16:50.858 187161 WARNING neutronclient.v2_0.client [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:50 compute-1 nova_compute[187157]: 2025-12-03 00:16:50.859 187161 WARNING neutronclient.v2_0.client [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:51 compute-1 podman[218348]: 2025-12-03 00:16:51.214414265 +0000 UTC m=+0.062545500 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:16:51 compute-1 nova_compute[187157]: 2025-12-03 00:16:51.537 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Migration for instance c1df5044-c7ad-42e6-93bd-4b5a853ab3b8 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:16:51 compute-1 nova_compute[187157]: 2025-12-03 00:16:51.855 187161 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:16:51 compute-1 nova_compute[187157]: 2025-12-03 00:16:51.856 187161 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:16:51 compute-1 nova_compute[187157]: 2025-12-03 00:16:51.856 187161 DEBUG nova.network.neutron [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:16:52 compute-1 nova_compute[187157]: 2025-12-03 00:16:52.126 187161 INFO nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Updating resource usage from migration 1a09d0f0-d7a1-422a-acd5-e6a0723f1ba4
Dec 03 00:16:52 compute-1 nova_compute[187157]: 2025-12-03 00:16:52.126 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Starting to track incoming migration 1a09d0f0-d7a1-422a-acd5-e6a0723f1ba4 with flavor 961ca853-f9ec-479e-bfb6-9bdd23ae3e33 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 03 00:16:52 compute-1 nova_compute[187157]: 2025-12-03 00:16:52.386 187161 WARNING neutronclient.v2_0.client [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:53 compute-1 nova_compute[187157]: 2025-12-03 00:16:53.181 187161 WARNING nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance c1df5044-c7ad-42e6-93bd-4b5a853ab3b8 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1151, 'VCPU': 1}}.
Dec 03 00:16:53 compute-1 nova_compute[187157]: 2025-12-03 00:16:53.182 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:16:53 compute-1 nova_compute[187157]: 2025-12-03 00:16:53.182 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=1663MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:16:50 up  1:23,  0 user,  load average: 0.24, 0.26, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:16:53 compute-1 nova_compute[187157]: 2025-12-03 00:16:53.213 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:16:53 compute-1 nova_compute[187157]: 2025-12-03 00:16:53.530 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:53 compute-1 nova_compute[187157]: 2025-12-03 00:16:53.728 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:16:53 compute-1 nova_compute[187157]: 2025-12-03 00:16:53.912 187161 WARNING neutronclient.v2_0.client [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:54 compute-1 nova_compute[187157]: 2025-12-03 00:16:54.078 187161 DEBUG nova.network.neutron [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Updating instance_info_cache with network_info: [{"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:16:54 compute-1 nova_compute[187157]: 2025-12-03 00:16:54.238 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:16:54 compute-1 nova_compute[187157]: 2025-12-03 00:16:54.238 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.719s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:54 compute-1 nova_compute[187157]: 2025-12-03 00:16:54.585 187161 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:16:55 compute-1 nova_compute[187157]: 2025-12-03 00:16:55.104 187161 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:55 compute-1 nova_compute[187157]: 2025-12-03 00:16:55.105 187161 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:55 compute-1 nova_compute[187157]: 2025-12-03 00:16:55.106 187161 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:55 compute-1 nova_compute[187157]: 2025-12-03 00:16:55.110 187161 INFO nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 03 00:16:55 compute-1 virtqemud[186882]: Domain id=20 name='instance-00000018' uuid=c1df5044-c7ad-42e6-93bd-4b5a853ab3b8 is tainted: custom-monitor
Dec 03 00:16:55 compute-1 nova_compute[187157]: 2025-12-03 00:16:55.779 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:56 compute-1 nova_compute[187157]: 2025-12-03 00:16:56.117 187161 INFO nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 03 00:16:57 compute-1 nova_compute[187157]: 2025-12-03 00:16:57.123 187161 INFO nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 03 00:16:57 compute-1 nova_compute[187157]: 2025-12-03 00:16:57.128 187161 DEBUG nova.compute.manager [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:16:57 compute-1 nova_compute[187157]: 2025-12-03 00:16:57.641 187161 DEBUG nova.objects.instance [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 03 00:16:58 compute-1 nova_compute[187157]: 2025-12-03 00:16:58.531 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:58 compute-1 nova_compute[187157]: 2025-12-03 00:16:58.658 187161 WARNING neutronclient.v2_0.client [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:58 compute-1 nova_compute[187157]: 2025-12-03 00:16:58.803 187161 WARNING neutronclient.v2_0.client [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:58 compute-1 nova_compute[187157]: 2025-12-03 00:16:58.804 187161 WARNING neutronclient.v2_0.client [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:17:00 compute-1 nova_compute[187157]: 2025-12-03 00:17:00.783 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:01 compute-1 podman[218369]: 2025-12-03 00:17:01.217448076 +0000 UTC m=+0.056591927 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:17:01 compute-1 nova_compute[187157]: 2025-12-03 00:17:01.234 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:01 compute-1 nova_compute[187157]: 2025-12-03 00:17:01.235 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:01 compute-1 nova_compute[187157]: 2025-12-03 00:17:01.235 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:01 compute-1 nova_compute[187157]: 2025-12-03 00:17:01.235 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:17:01 compute-1 nova_compute[187157]: 2025-12-03 00:17:01.696 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:01.741 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:01.742 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:01.742 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:02 compute-1 nova_compute[187157]: 2025-12-03 00:17:02.205 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:03 compute-1 nova_compute[187157]: 2025-12-03 00:17:03.533 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:05 compute-1 podman[218397]: 2025-12-03 00:17:05.226442372 +0000 UTC m=+0.073995429 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 03 00:17:05 compute-1 podman[218398]: 2025-12-03 00:17:05.255258842 +0000 UTC m=+0.099058538 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 03 00:17:05 compute-1 podman[197537]: time="2025-12-03T00:17:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:17:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:17:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:17:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:17:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3080 "" "Go-http-client/1.1"
Dec 03 00:17:05 compute-1 nova_compute[187157]: 2025-12-03 00:17:05.786 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:08 compute-1 nova_compute[187157]: 2025-12-03 00:17:08.535 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:08 compute-1 nova_compute[187157]: 2025-12-03 00:17:08.915 187161 DEBUG oslo_concurrency.lockutils [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:08 compute-1 nova_compute[187157]: 2025-12-03 00:17:08.916 187161 DEBUG oslo_concurrency.lockutils [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:08 compute-1 nova_compute[187157]: 2025-12-03 00:17:08.916 187161 DEBUG oslo_concurrency.lockutils [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:08 compute-1 nova_compute[187157]: 2025-12-03 00:17:08.916 187161 DEBUG oslo_concurrency.lockutils [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:08 compute-1 nova_compute[187157]: 2025-12-03 00:17:08.917 187161 DEBUG oslo_concurrency.lockutils [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:08 compute-1 nova_compute[187157]: 2025-12-03 00:17:08.930 187161 INFO nova.compute.manager [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Terminating instance
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.448 187161 DEBUG nova.compute.manager [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:17:09 compute-1 kernel: tap75f7bf8b-14 (unregistering): left promiscuous mode
Dec 03 00:17:09 compute-1 NetworkManager[55553]: <info>  [1764721029.4726] device (tap75f7bf8b-14): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:17:09 compute-1 ovn_controller[95464]: 2025-12-03T00:17:09Z|00226|binding|INFO|Releasing lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 from this chassis (sb_readonly=0)
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.476 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:09 compute-1 ovn_controller[95464]: 2025-12-03T00:17:09Z|00227|binding|INFO|Setting lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 down in Southbound
Dec 03 00:17:09 compute-1 ovn_controller[95464]: 2025-12-03T00:17:09Z|00228|binding|INFO|Removing iface tap75f7bf8b-14 ovn-installed in OVS
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.478 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.484 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:86:86 10.100.0.3'], port_security=['fa:16:3e:e6:86:86 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c1df5044-c7ad-42e6-93bd-4b5a853ab3b8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '17', 'neutron:security_group_ids': 'f1e1fe27-b2d8-445b-bf72-1b1a8b133d14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f585d4-379a-4fd2-84e7-6b4069bbb279, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=75f7bf8b-141c-44e2-be3c-1fdae9af1077) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.485 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 75f7bf8b-141c-44e2-be3c-1fdae9af1077 in datapath ee60e03c-ab3a-419f-84ef-62aec4b6b0dd unbound from our chassis
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.485 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.486 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[c273a9bd-c9a1-44a0-bc3f-5135f1bdbfc5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.487 104348 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd namespace which is not needed anymore
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.506 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:09 compute-1 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000018.scope: Deactivated successfully.
Dec 03 00:17:09 compute-1 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000018.scope: Consumed 2.325s CPU time.
Dec 03 00:17:09 compute-1 systemd-machined[153454]: Machine qemu-20-instance-00000018 terminated.
Dec 03 00:17:09 compute-1 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218300]: [NOTICE]   (218304) : haproxy version is 3.0.5-8e879a5
Dec 03 00:17:09 compute-1 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218300]: [NOTICE]   (218304) : path to executable is /usr/sbin/haproxy
Dec 03 00:17:09 compute-1 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218300]: [WARNING]  (218304) : Exiting Master process...
Dec 03 00:17:09 compute-1 podman[218464]: 2025-12-03 00:17:09.595444027 +0000 UTC m=+0.025825288 container kill e98fc35fb88b0c267d1e838712c3d5e2882277656e5663dd5abcc11954aa83bf (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:17:09 compute-1 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218300]: [ALERT]    (218304) : Current worker (218306) exited with code 143 (Terminated)
Dec 03 00:17:09 compute-1 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218300]: [WARNING]  (218304) : All workers exited. Exiting... (0)
Dec 03 00:17:09 compute-1 systemd[1]: libpod-e98fc35fb88b0c267d1e838712c3d5e2882277656e5663dd5abcc11954aa83bf.scope: Deactivated successfully.
Dec 03 00:17:09 compute-1 conmon[218300]: conmon e98fc35fb88b0c267d1e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e98fc35fb88b0c267d1e838712c3d5e2882277656e5663dd5abcc11954aa83bf.scope/container/memory.events
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.607 187161 DEBUG nova.compute.manager [req-a96a2d90-3aed-4878-8060-d4cc5e926ce8 req-f2aac677-c995-41ae-b810-e97cf4b9bfed 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.608 187161 DEBUG oslo_concurrency.lockutils [req-a96a2d90-3aed-4878-8060-d4cc5e926ce8 req-f2aac677-c995-41ae-b810-e97cf4b9bfed 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.608 187161 DEBUG oslo_concurrency.lockutils [req-a96a2d90-3aed-4878-8060-d4cc5e926ce8 req-f2aac677-c995-41ae-b810-e97cf4b9bfed 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.608 187161 DEBUG oslo_concurrency.lockutils [req-a96a2d90-3aed-4878-8060-d4cc5e926ce8 req-f2aac677-c995-41ae-b810-e97cf4b9bfed 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.608 187161 DEBUG nova.compute.manager [req-a96a2d90-3aed-4878-8060-d4cc5e926ce8 req-f2aac677-c995-41ae-b810-e97cf4b9bfed 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] No waiting events found dispatching network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.609 187161 DEBUG nova.compute.manager [req-a96a2d90-3aed-4878-8060-d4cc5e926ce8 req-f2aac677-c995-41ae-b810-e97cf4b9bfed 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:17:09 compute-1 podman[218480]: 2025-12-03 00:17:09.640170224 +0000 UTC m=+0.023625105 container died e98fc35fb88b0c267d1e838712c3d5e2882277656e5663dd5abcc11954aa83bf (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:17:09 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e98fc35fb88b0c267d1e838712c3d5e2882277656e5663dd5abcc11954aa83bf-userdata-shm.mount: Deactivated successfully.
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.668 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:09 compute-1 systemd[1]: var-lib-containers-storage-overlay-14b770b258dbfe5d4bdcf538762ecd27513f87192eecaf482dc6f425f91f49f5-merged.mount: Deactivated successfully.
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.671 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:09 compute-1 podman[218480]: 2025-12-03 00:17:09.673264138 +0000 UTC m=+0.056719019 container cleanup e98fc35fb88b0c267d1e838712c3d5e2882277656e5663dd5abcc11954aa83bf (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:17:09 compute-1 systemd[1]: libpod-conmon-e98fc35fb88b0c267d1e838712c3d5e2882277656e5663dd5abcc11954aa83bf.scope: Deactivated successfully.
Dec 03 00:17:09 compute-1 podman[218481]: 2025-12-03 00:17:09.699391103 +0000 UTC m=+0.078489159 container remove e98fc35fb88b0c267d1e838712c3d5e2882277656e5663dd5abcc11954aa83bf (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.704 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[61432fed-a597-487b-b022-7d2e8aacf8ec]: (4, ("Wed Dec  3 12:17:09 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd (e98fc35fb88b0c267d1e838712c3d5e2882277656e5663dd5abcc11954aa83bf)\ne98fc35fb88b0c267d1e838712c3d5e2882277656e5663dd5abcc11954aa83bf\nWed Dec  3 12:17:09 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd (e98fc35fb88b0c267d1e838712c3d5e2882277656e5663dd5abcc11954aa83bf)\ne98fc35fb88b0c267d1e838712c3d5e2882277656e5663dd5abcc11954aa83bf\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.705 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b9104793-069f-4bc9-8a61-6c79f63f8366]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.706 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.707 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[bb15db91-185e-4769-bca6-0c594e97a1d8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.707 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee60e03c-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.708 187161 INFO nova.virt.libvirt.driver [-] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Instance destroyed successfully.
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.708 187161 DEBUG nova.objects.instance [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lazy-loading 'resources' on Instance uuid c1df5044-c7ad-42e6-93bd-4b5a853ab3b8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.709 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:09 compute-1 kernel: tapee60e03c-a0: left promiscuous mode
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.724 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.727 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[0a2a7bd3-b645-4f16-a0d3-8ebe4f850c37]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.748 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[2f5f1969-82b7-4353-ad3b-21afa6d9b572]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.749 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[be2dcd92-df37-4fef-ad28-f04d05963e0f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.761 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[355a63aa-df71-4144-9b20-02f9a46abb32]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502538, 'reachable_time': 43531, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218527, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.763 104464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.763 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[b4212810-312c-4b03-aa6b-9dbe688a4aad]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:09 compute-1 systemd[1]: run-netns-ovnmeta\x2dee60e03c\x2dab3a\x2d419f\x2d84ef\x2d62aec4b6b0dd.mount: Deactivated successfully.
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.970 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:17:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:09.971 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:17:09 compute-1 nova_compute[187157]: 2025-12-03 00:17:09.971 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.217 187161 DEBUG nova.virt.libvirt.vif [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-03T00:15:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-984503060',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-984503060',id=24,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:15:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-4tk0mv8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',clean_attempts='1',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:16:58Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=c1df5044-c7ad-42e6-93bd-4b5a853ab3b8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.217 187161 DEBUG nova.network.os_vif_util [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converting VIF {"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.218 187161 DEBUG nova.network.os_vif_util [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e6:86:86,bridge_name='br-int',has_traffic_filtering=True,id=75f7bf8b-141c-44e2-be3c-1fdae9af1077,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f7bf8b-14') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.218 187161 DEBUG os_vif [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e6:86:86,bridge_name='br-int',has_traffic_filtering=True,id=75f7bf8b-141c-44e2-be3c-1fdae9af1077,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f7bf8b-14') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.219 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.220 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75f7bf8b-14, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.221 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.222 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.223 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.223 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=15fd200a-0255-49e6-85a5-644e548046bf) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.224 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.225 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.227 187161 INFO os_vif [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e6:86:86,bridge_name='br-int',has_traffic_filtering=True,id=75f7bf8b-141c-44e2-be3c-1fdae9af1077,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f7bf8b-14')
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.227 187161 INFO nova.virt.libvirt.driver [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Deleting instance files /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8_del
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.228 187161 INFO nova.virt.libvirt.driver [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Deletion of /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8_del complete
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.741 187161 INFO nova.compute.manager [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Took 1.29 seconds to destroy the instance on the hypervisor.
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.742 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.742 187161 DEBUG nova.compute.manager [-] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.742 187161 DEBUG nova.network.neutron [-] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:17:10 compute-1 nova_compute[187157]: 2025-12-03 00:17:10.742 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:17:11 compute-1 nova_compute[187157]: 2025-12-03 00:17:11.688 187161 DEBUG nova.compute.manager [req-95929669-5fd1-4ebc-b8ba-cdc2d736c51a req-bb1018dd-63f5-4ce8-9fc5-a5d95aa7f239 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:17:11 compute-1 nova_compute[187157]: 2025-12-03 00:17:11.688 187161 DEBUG oslo_concurrency.lockutils [req-95929669-5fd1-4ebc-b8ba-cdc2d736c51a req-bb1018dd-63f5-4ce8-9fc5-a5d95aa7f239 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:11 compute-1 nova_compute[187157]: 2025-12-03 00:17:11.688 187161 DEBUG oslo_concurrency.lockutils [req-95929669-5fd1-4ebc-b8ba-cdc2d736c51a req-bb1018dd-63f5-4ce8-9fc5-a5d95aa7f239 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:11 compute-1 nova_compute[187157]: 2025-12-03 00:17:11.688 187161 DEBUG oslo_concurrency.lockutils [req-95929669-5fd1-4ebc-b8ba-cdc2d736c51a req-bb1018dd-63f5-4ce8-9fc5-a5d95aa7f239 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:11 compute-1 nova_compute[187157]: 2025-12-03 00:17:11.689 187161 DEBUG nova.compute.manager [req-95929669-5fd1-4ebc-b8ba-cdc2d736c51a req-bb1018dd-63f5-4ce8-9fc5-a5d95aa7f239 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] No waiting events found dispatching network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:17:11 compute-1 nova_compute[187157]: 2025-12-03 00:17:11.689 187161 DEBUG nova.compute.manager [req-95929669-5fd1-4ebc-b8ba-cdc2d736c51a req-bb1018dd-63f5-4ce8-9fc5-a5d95aa7f239 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:17:11 compute-1 nova_compute[187157]: 2025-12-03 00:17:11.845 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:17:12 compute-1 nova_compute[187157]: 2025-12-03 00:17:12.334 187161 DEBUG nova.compute.manager [req-c9a96592-64cb-4523-bd5f-9d5096cb9e17 req-64b10fa7-6ffc-4130-9238-8d28598c8ab8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-deleted-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:17:12 compute-1 nova_compute[187157]: 2025-12-03 00:17:12.335 187161 INFO nova.compute.manager [req-c9a96592-64cb-4523-bd5f-9d5096cb9e17 req-64b10fa7-6ffc-4130-9238-8d28598c8ab8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Neutron deleted interface 75f7bf8b-141c-44e2-be3c-1fdae9af1077; detaching it from the instance and deleting it from the info cache
Dec 03 00:17:12 compute-1 nova_compute[187157]: 2025-12-03 00:17:12.335 187161 DEBUG nova.network.neutron [req-c9a96592-64cb-4523-bd5f-9d5096cb9e17 req-64b10fa7-6ffc-4130-9238-8d28598c8ab8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:17:12 compute-1 nova_compute[187157]: 2025-12-03 00:17:12.788 187161 DEBUG nova.network.neutron [-] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:17:12 compute-1 nova_compute[187157]: 2025-12-03 00:17:12.841 187161 DEBUG nova.compute.manager [req-c9a96592-64cb-4523-bd5f-9d5096cb9e17 req-64b10fa7-6ffc-4130-9238-8d28598c8ab8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Detach interface failed, port_id=75f7bf8b-141c-44e2-be3c-1fdae9af1077, reason: Instance c1df5044-c7ad-42e6-93bd-4b5a853ab3b8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:17:13 compute-1 nova_compute[187157]: 2025-12-03 00:17:13.295 187161 INFO nova.compute.manager [-] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Took 2.55 seconds to deallocate network for instance.
Dec 03 00:17:13 compute-1 nova_compute[187157]: 2025-12-03 00:17:13.537 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:13 compute-1 nova_compute[187157]: 2025-12-03 00:17:13.809 187161 DEBUG oslo_concurrency.lockutils [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:13 compute-1 nova_compute[187157]: 2025-12-03 00:17:13.810 187161 DEBUG oslo_concurrency.lockutils [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:13 compute-1 nova_compute[187157]: 2025-12-03 00:17:13.814 187161 DEBUG oslo_concurrency.lockutils [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:13 compute-1 nova_compute[187157]: 2025-12-03 00:17:13.855 187161 INFO nova.scheduler.client.report [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Deleted allocations for instance c1df5044-c7ad-42e6-93bd-4b5a853ab3b8
Dec 03 00:17:14 compute-1 nova_compute[187157]: 2025-12-03 00:17:14.884 187161 DEBUG oslo_concurrency.lockutils [None req-c1d49973-04b9-4031-ada0-be74dd77254d 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.968s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:15 compute-1 nova_compute[187157]: 2025-12-03 00:17:15.224 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:18 compute-1 nova_compute[187157]: 2025-12-03 00:17:18.539 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:19 compute-1 podman[218530]: 2025-12-03 00:17:19.224127129 +0000 UTC m=+0.069385606 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.expose-services=, architecture=x86_64)
Dec 03 00:17:19 compute-1 nova_compute[187157]: 2025-12-03 00:17:19.279 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:19 compute-1 openstack_network_exporter[199685]: ERROR   00:17:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:17:19 compute-1 openstack_network_exporter[199685]: ERROR   00:17:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:17:19 compute-1 openstack_network_exporter[199685]: ERROR   00:17:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:17:19 compute-1 openstack_network_exporter[199685]: ERROR   00:17:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:17:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:17:19 compute-1 openstack_network_exporter[199685]: ERROR   00:17:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:17:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:17:19 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:19.972 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:17:20 compute-1 nova_compute[187157]: 2025-12-03 00:17:20.227 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:22 compute-1 podman[218554]: 2025-12-03 00:17:22.245855626 +0000 UTC m=+0.077455794 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd)
Dec 03 00:17:23 compute-1 nova_compute[187157]: 2025-12-03 00:17:23.541 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:24 compute-1 sshd-session[218574]: Invalid user solana from 45.148.10.240 port 48732
Dec 03 00:17:24 compute-1 sshd-session[218574]: Connection closed by invalid user solana 45.148.10.240 port 48732 [preauth]
Dec 03 00:17:25 compute-1 nova_compute[187157]: 2025-12-03 00:17:25.229 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:28 compute-1 nova_compute[187157]: 2025-12-03 00:17:28.584 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:30 compute-1 nova_compute[187157]: 2025-12-03 00:17:30.231 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:31 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:31.958 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:70:96:25 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3da51bfd7f1c491b839f6b6b49056c8b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=636cd919-869d-4a8a-92fa-ec7c18804da5) old=Port_Binding(mac=['fa:16:3e:70:96:25'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3da51bfd7f1c491b839f6b6b49056c8b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:17:31 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:31.959 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 636cd919-869d-4a8a-92fa-ec7c18804da5 in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 updated
Dec 03 00:17:31 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:31.960 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:17:31 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:31.960 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[2b56190e-b1b4-41fc-8fb4-d017a76b2cca]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:32 compute-1 podman[218576]: 2025-12-03 00:17:32.206789362 +0000 UTC m=+0.049008773 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:17:33 compute-1 nova_compute[187157]: 2025-12-03 00:17:33.586 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:35 compute-1 nova_compute[187157]: 2025-12-03 00:17:35.233 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:35 compute-1 podman[197537]: time="2025-12-03T00:17:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:17:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:17:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:17:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:17:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2612 "" "Go-http-client/1.1"
Dec 03 00:17:36 compute-1 podman[218600]: 2025-12-03 00:17:36.216848464 +0000 UTC m=+0.047410044 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Dec 03 00:17:36 compute-1 podman[218601]: 2025-12-03 00:17:36.278520394 +0000 UTC m=+0.105647551 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller)
Dec 03 00:17:38 compute-1 nova_compute[187157]: 2025-12-03 00:17:38.589 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:40 compute-1 nova_compute[187157]: 2025-12-03 00:17:40.234 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:41 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:41.223 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:6b:b4 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-296b09f4-618a-4795-9eb9-f83709052164', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-296b09f4-618a-4795-9eb9-f83709052164', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6792b8fb-c596-438f-8f0f-cceaba427dae, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=45d438f6-104b-45ed-8931-ccdd86402201) old=Port_Binding(mac=['fa:16:3e:a0:6b:b4'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-296b09f4-618a-4795-9eb9-f83709052164', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-296b09f4-618a-4795-9eb9-f83709052164', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:17:41 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:41.224 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 45d438f6-104b-45ed-8931-ccdd86402201 in datapath 296b09f4-618a-4795-9eb9-f83709052164 updated
Dec 03 00:17:41 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:41.225 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 296b09f4-618a-4795-9eb9-f83709052164, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:17:41 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:17:41.226 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ff07fcba-d4ea-4eae-861b-e4362be7ba23]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:43 compute-1 nova_compute[187157]: 2025-12-03 00:17:43.590 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:44 compute-1 nova_compute[187157]: 2025-12-03 00:17:44.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:45 compute-1 nova_compute[187157]: 2025-12-03 00:17:45.237 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:45 compute-1 nova_compute[187157]: 2025-12-03 00:17:45.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:48 compute-1 nova_compute[187157]: 2025-12-03 00:17:48.592 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:49 compute-1 nova_compute[187157]: 2025-12-03 00:17:49.210 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:49 compute-1 nova_compute[187157]: 2025-12-03 00:17:49.211 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:49 compute-1 openstack_network_exporter[199685]: ERROR   00:17:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:17:49 compute-1 openstack_network_exporter[199685]: ERROR   00:17:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:17:49 compute-1 openstack_network_exporter[199685]: ERROR   00:17:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:17:49 compute-1 openstack_network_exporter[199685]: ERROR   00:17:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:17:49 compute-1 openstack_network_exporter[199685]: ERROR   00:17:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:17:49 compute-1 nova_compute[187157]: 2025-12-03 00:17:49.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:50 compute-1 podman[218642]: 2025-12-03 00:17:50.219562687 +0000 UTC m=+0.061981818 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Dec 03 00:17:50 compute-1 nova_compute[187157]: 2025-12-03 00:17:50.220 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:50 compute-1 nova_compute[187157]: 2025-12-03 00:17:50.221 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:50 compute-1 nova_compute[187157]: 2025-12-03 00:17:50.221 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:50 compute-1 nova_compute[187157]: 2025-12-03 00:17:50.221 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:17:50 compute-1 nova_compute[187157]: 2025-12-03 00:17:50.239 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:50 compute-1 nova_compute[187157]: 2025-12-03 00:17:50.367 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:17:50 compute-1 nova_compute[187157]: 2025-12-03 00:17:50.368 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:17:50 compute-1 nova_compute[187157]: 2025-12-03 00:17:50.394 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:17:50 compute-1 nova_compute[187157]: 2025-12-03 00:17:50.395 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5839MB free_disk=73.16609954833984GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:17:50 compute-1 nova_compute[187157]: 2025-12-03 00:17:50.395 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:50 compute-1 nova_compute[187157]: 2025-12-03 00:17:50.395 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:51 compute-1 nova_compute[187157]: 2025-12-03 00:17:51.437 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:17:51 compute-1 nova_compute[187157]: 2025-12-03 00:17:51.438 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:17:50 up  1:24,  0 user,  load average: 0.09, 0.21, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:17:51 compute-1 nova_compute[187157]: 2025-12-03 00:17:51.454 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:17:51 compute-1 nova_compute[187157]: 2025-12-03 00:17:51.959 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:17:52 compute-1 ovn_controller[95464]: 2025-12-03T00:17:52Z|00229|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 03 00:17:52 compute-1 nova_compute[187157]: 2025-12-03 00:17:52.467 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:17:52 compute-1 nova_compute[187157]: 2025-12-03 00:17:52.467 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.072s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:52 compute-1 nova_compute[187157]: 2025-12-03 00:17:52.467 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:52 compute-1 nova_compute[187157]: 2025-12-03 00:17:52.468 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 03 00:17:53 compute-1 podman[218665]: 2025-12-03 00:17:53.253633861 +0000 UTC m=+0.091066095 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251202, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:17:53 compute-1 nova_compute[187157]: 2025-12-03 00:17:53.593 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:55 compute-1 nova_compute[187157]: 2025-12-03 00:17:55.206 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:55 compute-1 nova_compute[187157]: 2025-12-03 00:17:55.207 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 03 00:17:55 compute-1 nova_compute[187157]: 2025-12-03 00:17:55.241 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:55 compute-1 nova_compute[187157]: 2025-12-03 00:17:55.716 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 03 00:17:57 compute-1 nova_compute[187157]: 2025-12-03 00:17:57.206 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:58 compute-1 nova_compute[187157]: 2025-12-03 00:17:58.629 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:58 compute-1 nova_compute[187157]: 2025-12-03 00:17:58.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:18:00 compute-1 nova_compute[187157]: 2025-12-03 00:18:00.243 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:00 compute-1 nova_compute[187157]: 2025-12-03 00:18:00.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:18:00 compute-1 nova_compute[187157]: 2025-12-03 00:18:00.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:18:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:18:01.743 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:18:01.743 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:18:01.744 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:02 compute-1 nova_compute[187157]: 2025-12-03 00:18:02.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:18:03 compute-1 podman[218686]: 2025-12-03 00:18:03.202567746 +0000 UTC m=+0.051053571 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:18:03 compute-1 nova_compute[187157]: 2025-12-03 00:18:03.632 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:05 compute-1 nova_compute[187157]: 2025-12-03 00:18:05.244 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:05 compute-1 podman[197537]: time="2025-12-03T00:18:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:18:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:18:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:18:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:18:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2618 "" "Go-http-client/1.1"
Dec 03 00:18:06 compute-1 sshd-session[218710]: Invalid user solana from 193.32.162.146 port 44348
Dec 03 00:18:06 compute-1 podman[218712]: 2025-12-03 00:18:06.791721751 +0000 UTC m=+0.056234638 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 03 00:18:06 compute-1 podman[218713]: 2025-12-03 00:18:06.817659731 +0000 UTC m=+0.078388566 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Dec 03 00:18:06 compute-1 sshd-session[218710]: Connection closed by invalid user solana 193.32.162.146 port 44348 [preauth]
Dec 03 00:18:08 compute-1 nova_compute[187157]: 2025-12-03 00:18:08.632 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:10 compute-1 nova_compute[187157]: 2025-12-03 00:18:10.246 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:13 compute-1 nova_compute[187157]: 2025-12-03 00:18:13.635 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:15 compute-1 nova_compute[187157]: 2025-12-03 00:18:15.248 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:18 compute-1 nova_compute[187157]: 2025-12-03 00:18:18.682 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:19 compute-1 openstack_network_exporter[199685]: ERROR   00:18:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:18:19 compute-1 openstack_network_exporter[199685]: ERROR   00:18:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:18:19 compute-1 openstack_network_exporter[199685]: ERROR   00:18:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:18:19 compute-1 openstack_network_exporter[199685]: ERROR   00:18:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:18:19 compute-1 openstack_network_exporter[199685]: ERROR   00:18:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:18:20 compute-1 nova_compute[187157]: 2025-12-03 00:18:20.251 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:21 compute-1 podman[218758]: 2025-12-03 00:18:21.223797233 +0000 UTC m=+0.069112410 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, version=9.6, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 03 00:18:23 compute-1 nova_compute[187157]: 2025-12-03 00:18:23.682 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:24 compute-1 podman[218781]: 2025-12-03 00:18:24.238262193 +0000 UTC m=+0.079319638 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 03 00:18:25 compute-1 nova_compute[187157]: 2025-12-03 00:18:25.253 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:28 compute-1 nova_compute[187157]: 2025-12-03 00:18:28.698 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:30 compute-1 nova_compute[187157]: 2025-12-03 00:18:30.254 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:18:32.059 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:18:32 compute-1 nova_compute[187157]: 2025-12-03 00:18:32.060 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:32 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:18:32.061 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:18:33 compute-1 nova_compute[187157]: 2025-12-03 00:18:33.699 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:34 compute-1 podman[218802]: 2025-12-03 00:18:34.223179593 +0000 UTC m=+0.060693647 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:18:35 compute-1 nova_compute[187157]: 2025-12-03 00:18:35.255 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:35 compute-1 podman[197537]: time="2025-12-03T00:18:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:18:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:18:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:18:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:18:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2615 "" "Go-http-client/1.1"
Dec 03 00:18:37 compute-1 podman[218827]: 2025-12-03 00:18:37.215604128 +0000 UTC m=+0.061997768 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:18:37 compute-1 podman[218828]: 2025-12-03 00:18:37.252575756 +0000 UTC m=+0.091478344 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller)
Dec 03 00:18:38 compute-1 nova_compute[187157]: 2025-12-03 00:18:38.702 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:39 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:18:39.062 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:40 compute-1 nova_compute[187157]: 2025-12-03 00:18:40.257 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:43 compute-1 nova_compute[187157]: 2025-12-03 00:18:43.704 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:45 compute-1 nova_compute[187157]: 2025-12-03 00:18:45.259 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:46 compute-1 nova_compute[187157]: 2025-12-03 00:18:46.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:18:47 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 03 00:18:48 compute-1 nova_compute[187157]: 2025-12-03 00:18:48.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:18:48 compute-1 nova_compute[187157]: 2025-12-03 00:18:48.706 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:49 compute-1 openstack_network_exporter[199685]: ERROR   00:18:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:18:49 compute-1 openstack_network_exporter[199685]: ERROR   00:18:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:18:49 compute-1 openstack_network_exporter[199685]: ERROR   00:18:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:18:49 compute-1 openstack_network_exporter[199685]: ERROR   00:18:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:18:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:18:49 compute-1 openstack_network_exporter[199685]: ERROR   00:18:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:18:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:18:49 compute-1 nova_compute[187157]: 2025-12-03 00:18:49.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:18:49 compute-1 nova_compute[187157]: 2025-12-03 00:18:49.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:18:50 compute-1 nova_compute[187157]: 2025-12-03 00:18:50.260 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:50 compute-1 nova_compute[187157]: 2025-12-03 00:18:50.265 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:50 compute-1 nova_compute[187157]: 2025-12-03 00:18:50.266 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:50 compute-1 nova_compute[187157]: 2025-12-03 00:18:50.266 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:50 compute-1 nova_compute[187157]: 2025-12-03 00:18:50.267 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:18:50 compute-1 nova_compute[187157]: 2025-12-03 00:18:50.417 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:18:50 compute-1 nova_compute[187157]: 2025-12-03 00:18:50.419 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:18:50 compute-1 nova_compute[187157]: 2025-12-03 00:18:50.450 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:18:50 compute-1 nova_compute[187157]: 2025-12-03 00:18:50.451 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5833MB free_disk=73.16510009765625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:18:50 compute-1 nova_compute[187157]: 2025-12-03 00:18:50.451 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:50 compute-1 nova_compute[187157]: 2025-12-03 00:18:50.451 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:51 compute-1 nova_compute[187157]: 2025-12-03 00:18:51.563 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:18:51 compute-1 nova_compute[187157]: 2025-12-03 00:18:51.564 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:18:50 up  1:25,  0 user,  load average: 0.03, 0.17, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:18:51 compute-1 nova_compute[187157]: 2025-12-03 00:18:51.586 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing inventories for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 03 00:18:51 compute-1 nova_compute[187157]: 2025-12-03 00:18:51.706 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Updating ProviderTree inventory for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 03 00:18:51 compute-1 nova_compute[187157]: 2025-12-03 00:18:51.706 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Updating inventory in ProviderTree for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 03 00:18:51 compute-1 nova_compute[187157]: 2025-12-03 00:18:51.721 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing aggregate associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 03 00:18:51 compute-1 nova_compute[187157]: 2025-12-03 00:18:51.740 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing trait associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ARCH_X86_64,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 03 00:18:51 compute-1 nova_compute[187157]: 2025-12-03 00:18:51.766 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:18:52 compute-1 podman[218870]: 2025-12-03 00:18:52.208896898 +0000 UTC m=+0.048589561 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350)
Dec 03 00:18:52 compute-1 nova_compute[187157]: 2025-12-03 00:18:52.273 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:18:52 compute-1 nova_compute[187157]: 2025-12-03 00:18:52.783 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:18:52 compute-1 nova_compute[187157]: 2025-12-03 00:18:52.783 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.332s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:53 compute-1 nova_compute[187157]: 2025-12-03 00:18:53.707 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:55 compute-1 podman[218891]: 2025-12-03 00:18:55.223640864 +0000 UTC m=+0.065365378 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 03 00:18:55 compute-1 nova_compute[187157]: 2025-12-03 00:18:55.305 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:58 compute-1 nova_compute[187157]: 2025-12-03 00:18:58.710 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:59 compute-1 nova_compute[187157]: 2025-12-03 00:18:59.780 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:00 compute-1 nova_compute[187157]: 2025-12-03 00:19:00.308 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:00 compute-1 nova_compute[187157]: 2025-12-03 00:19:00.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:01.744 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:01.744 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:01.745 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:02 compute-1 nova_compute[187157]: 2025-12-03 00:19:02.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:02 compute-1 nova_compute[187157]: 2025-12-03 00:19:02.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:02 compute-1 nova_compute[187157]: 2025-12-03 00:19:02.701 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:19:03 compute-1 nova_compute[187157]: 2025-12-03 00:19:03.446 187161 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Creating tmpfile /var/lib/nova/instances/tmpazvfqycv to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 03 00:19:03 compute-1 nova_compute[187157]: 2025-12-03 00:19:03.446 187161 WARNING neutronclient.v2_0.client [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:03 compute-1 nova_compute[187157]: 2025-12-03 00:19:03.449 187161 DEBUG nova.compute.manager [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpazvfqycv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 03 00:19:03 compute-1 nova_compute[187157]: 2025-12-03 00:19:03.456 187161 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Creating tmpfile /var/lib/nova/instances/tmpi5zwnlar to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 03 00:19:03 compute-1 nova_compute[187157]: 2025-12-03 00:19:03.457 187161 WARNING neutronclient.v2_0.client [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:03 compute-1 nova_compute[187157]: 2025-12-03 00:19:03.461 187161 DEBUG nova.compute.manager [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi5zwnlar',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 03 00:19:03 compute-1 nova_compute[187157]: 2025-12-03 00:19:03.711 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:05 compute-1 podman[218912]: 2025-12-03 00:19:05.211448474 +0000 UTC m=+0.052903177 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:19:05 compute-1 nova_compute[187157]: 2025-12-03 00:19:05.309 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:05 compute-1 nova_compute[187157]: 2025-12-03 00:19:05.483 187161 WARNING neutronclient.v2_0.client [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:05 compute-1 nova_compute[187157]: 2025-12-03 00:19:05.486 187161 WARNING neutronclient.v2_0.client [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:05 compute-1 podman[197537]: time="2025-12-03T00:19:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:19:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:19:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:19:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:19:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2617 "" "Go-http-client/1.1"
Dec 03 00:19:05 compute-1 nova_compute[187157]: 2025-12-03 00:19:05.695 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:08 compute-1 podman[218937]: 2025-12-03 00:19:08.22693269 +0000 UTC m=+0.065730528 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 03 00:19:08 compute-1 podman[218938]: 2025-12-03 00:19:08.253272969 +0000 UTC m=+0.096838354 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:19:08 compute-1 nova_compute[187157]: 2025-12-03 00:19:08.713 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:10 compute-1 nova_compute[187157]: 2025-12-03 00:19:10.016 187161 DEBUG nova.compute.manager [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi5zwnlar',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c9a442a2-b67f-45a9-a7b3-2f866d137327',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 03 00:19:10 compute-1 nova_compute[187157]: 2025-12-03 00:19:10.311 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:11 compute-1 nova_compute[187157]: 2025-12-03 00:19:11.230 187161 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-c9a442a2-b67f-45a9-a7b3-2f866d137327" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:19:11 compute-1 nova_compute[187157]: 2025-12-03 00:19:11.231 187161 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-c9a442a2-b67f-45a9-a7b3-2f866d137327" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:19:11 compute-1 nova_compute[187157]: 2025-12-03 00:19:11.231 187161 DEBUG nova.network.neutron [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:19:11 compute-1 nova_compute[187157]: 2025-12-03 00:19:11.910 187161 WARNING neutronclient.v2_0.client [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:12 compute-1 nova_compute[187157]: 2025-12-03 00:19:12.399 187161 WARNING neutronclient.v2_0.client [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:12 compute-1 nova_compute[187157]: 2025-12-03 00:19:12.711 187161 DEBUG nova.network.neutron [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Updating instance_info_cache with network_info: [{"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:19:13 compute-1 nova_compute[187157]: 2025-12-03 00:19:13.715 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.077 187161 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-c9a442a2-b67f-45a9-a7b3-2f866d137327" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.098 187161 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi5zwnlar',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c9a442a2-b67f-45a9-a7b3-2f866d137327',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.099 187161 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Creating instance directory: /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.099 187161 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Creating disk.info with the contents: {'/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk': 'qcow2', '/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.099 187161 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.100 187161 DEBUG nova.objects.instance [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'trusted_certs' on Instance uuid c9a442a2-b67f-45a9-a7b3-2f866d137327 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.606 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.609 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.610 187161 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.663 187161 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.664 187161 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.665 187161 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.666 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.670 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.671 187161 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.721 187161 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.723 187161 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.751 187161 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.752 187161 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.088s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.753 187161 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.802 187161 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.803 187161 DEBUG nova.virt.disk.api [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.803 187161 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.855 187161 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.856 187161 DEBUG nova.virt.disk.api [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:19:14 compute-1 nova_compute[187157]: 2025-12-03 00:19:14.856 187161 DEBUG nova.objects.instance [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid c9a442a2-b67f-45a9-a7b3-2f866d137327 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.313 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.371 187161 DEBUG nova.objects.base [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<c9a442a2-b67f-45a9-a7b3-2f866d137327> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.372 187161 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.411 187161 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk.config 497664" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.413 187161 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.415 187161 DEBUG nova.virt.libvirt.vif [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:17:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1409069009',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1409069',id=26,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:18:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-82faa1lu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:18:06Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=c9a442a2-b67f-45a9-a7b3-2f866d137327,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.416 187161 DEBUG nova.network.os_vif_util [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.417 187161 DEBUG nova.network.os_vif_util [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:4e:2a,bridge_name='br-int',has_traffic_filtering=True,id=c926feac-0f5a-4138-a74f-f066c3bf5f80,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc926feac-0f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.418 187161 DEBUG os_vif [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:4e:2a,bridge_name='br-int',has_traffic_filtering=True,id=c926feac-0f5a-4138-a74f-f066c3bf5f80,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc926feac-0f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.420 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.421 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.421 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.423 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.424 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '0600d7a0-8dfb-590e-af55-2c595eba2741', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.426 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.428 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.431 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.432 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc926feac-0f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.433 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapc926feac-0f, col_values=(('qos', UUID('f9e8be79-1d8c-4515-b700-4e70f245156d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.433 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapc926feac-0f, col_values=(('external_ids', {'iface-id': 'c926feac-0f5a-4138-a74f-f066c3bf5f80', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:91:4e:2a', 'vm-uuid': 'c9a442a2-b67f-45a9-a7b3-2f866d137327'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.436 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:15 compute-1 NetworkManager[55553]: <info>  [1764721155.4370] manager: (tapc926feac-0f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.440 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.446 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.447 187161 INFO os_vif [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:4e:2a,bridge_name='br-int',has_traffic_filtering=True,id=c926feac-0f5a-4138-a74f-f066c3bf5f80,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc926feac-0f')
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.447 187161 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.448 187161 DEBUG nova.compute.manager [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi5zwnlar',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c9a442a2-b67f-45a9-a7b3-2f866d137327',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.448 187161 WARNING neutronclient.v2_0.client [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:15 compute-1 nova_compute[187157]: 2025-12-03 00:19:15.926 187161 WARNING neutronclient.v2_0.client [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:17 compute-1 nova_compute[187157]: 2025-12-03 00:19:17.063 187161 DEBUG nova.network.neutron [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Port c926feac-0f5a-4138-a74f-f066c3bf5f80 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 03 00:19:17 compute-1 nova_compute[187157]: 2025-12-03 00:19:17.075 187161 DEBUG nova.compute.manager [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi5zwnlar',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c9a442a2-b67f-45a9-a7b3-2f866d137327',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 03 00:19:18 compute-1 nova_compute[187157]: 2025-12-03 00:19:18.752 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:19 compute-1 openstack_network_exporter[199685]: ERROR   00:19:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:19:19 compute-1 openstack_network_exporter[199685]: ERROR   00:19:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:19:19 compute-1 openstack_network_exporter[199685]: ERROR   00:19:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:19:19 compute-1 openstack_network_exporter[199685]: ERROR   00:19:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:19:19 compute-1 openstack_network_exporter[199685]: ERROR   00:19:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:19:19 compute-1 systemd[1]: Starting libvirt proxy daemon...
Dec 03 00:19:19 compute-1 systemd[1]: Started libvirt proxy daemon.
Dec 03 00:19:20 compute-1 kernel: tapc926feac-0f: entered promiscuous mode
Dec 03 00:19:20 compute-1 ovn_controller[95464]: 2025-12-03T00:19:20Z|00230|binding|INFO|Claiming lport c926feac-0f5a-4138-a74f-f066c3bf5f80 for this additional chassis.
Dec 03 00:19:20 compute-1 ovn_controller[95464]: 2025-12-03T00:19:20Z|00231|binding|INFO|c926feac-0f5a-4138-a74f-f066c3bf5f80: Claiming fa:16:3e:91:4e:2a 10.100.0.9
Dec 03 00:19:20 compute-1 nova_compute[187157]: 2025-12-03 00:19:20.068 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:20 compute-1 NetworkManager[55553]: <info>  [1764721160.0706] manager: (tapc926feac-0f): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Dec 03 00:19:20 compute-1 nova_compute[187157]: 2025-12-03 00:19:20.075 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.080 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:4e:2a 10.100.0.9'], port_security=['fa:16:3e:91:4e:2a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'c9a442a2-b67f-45a9-a7b3-2f866d137327', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'aa7bb0b8-346d-4df1-ade9-c8e68672df4d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=c926feac-0f5a-4138-a74f-f066c3bf5f80) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.081 104348 INFO neutron.agent.ovn.metadata.agent [-] Port c926feac-0f5a-4138-a74f-f066c3bf5f80 in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 unbound from our chassis
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.082 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.093 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[abae2e2d-8d3c-4265-bae6-007b6c200c19]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.094 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf7ff943d-e1 in ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.096 207957 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf7ff943d-e0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.096 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e85144d4-4e14-4807-af4d-c100b203a755]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.096 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[4bbad107-3f81-44b5-a658-8f96a226f90c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 systemd-udevd[219036]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.107 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[f37e1c1c-66a3-4db6-af26-288f4f4c7672]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 systemd-machined[153454]: New machine qemu-21-instance-0000001a.
Dec 03 00:19:20 compute-1 NetworkManager[55553]: <info>  [1764721160.1132] device (tapc926feac-0f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:19:20 compute-1 NetworkManager[55553]: <info>  [1764721160.1141] device (tapc926feac-0f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:19:20 compute-1 nova_compute[187157]: 2025-12-03 00:19:20.126 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.126 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[96e37f68-0579-419b-bf1d-a58e1fbe2d95]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 systemd[1]: Started Virtual Machine qemu-21-instance-0000001a.
Dec 03 00:19:20 compute-1 ovn_controller[95464]: 2025-12-03T00:19:20Z|00232|binding|INFO|Setting lport c926feac-0f5a-4138-a74f-f066c3bf5f80 ovn-installed in OVS
Dec 03 00:19:20 compute-1 nova_compute[187157]: 2025-12-03 00:19:20.137 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:20 compute-1 nova_compute[187157]: 2025-12-03 00:19:20.138 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.150 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[6ce04a30-3061-4074-a19b-a24cf567cd6f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.154 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[01e34d98-f6ed-4f5e-b2b9-1dff6c284547]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 NetworkManager[55553]: <info>  [1764721160.1557] manager: (tapf7ff943d-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.182 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[d8324e76-1d0f-4476-8b7e-684931e22ef5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.185 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[31a3b4dd-0bbd-4ce5-b686-fe05deeaf0d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 NetworkManager[55553]: <info>  [1764721160.2058] device (tapf7ff943d-e0): carrier: link connected
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.211 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[33c4093e-eba2-4611-82d7-3c32c8c18eeb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.225 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[45a1623a-7b85-473e-a3d5-799d61546804]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7ff943d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:96:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517986, 'reachable_time': 32705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219069, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.240 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[fc816bb7-660e-47e8-92ca-a25f0bf4e7e0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe70:9625'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517986, 'tstamp': 517986}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219070, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.256 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[842d159c-e057-48d9-8b60-4948fdf8c078]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7ff943d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:96:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517986, 'reachable_time': 32705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219071, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.283 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[21d3f977-fbd7-4923-aa20-32012985bf4a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.348 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a17664a8-de48-4760-8062-28664b01a8c3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.349 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ff943d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.349 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.350 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7ff943d-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:20 compute-1 nova_compute[187157]: 2025-12-03 00:19:20.351 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:20 compute-1 kernel: tapf7ff943d-e0: entered promiscuous mode
Dec 03 00:19:20 compute-1 NetworkManager[55553]: <info>  [1764721160.3525] manager: (tapf7ff943d-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Dec 03 00:19:20 compute-1 nova_compute[187157]: 2025-12-03 00:19:20.354 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.355 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7ff943d-e0, col_values=(('external_ids', {'iface-id': '636cd919-869d-4a8a-92fa-ec7c18804da5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:20 compute-1 nova_compute[187157]: 2025-12-03 00:19:20.356 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:20 compute-1 ovn_controller[95464]: 2025-12-03T00:19:20Z|00233|binding|INFO|Releasing lport 636cd919-869d-4a8a-92fa-ec7c18804da5 from this chassis (sb_readonly=0)
Dec 03 00:19:20 compute-1 nova_compute[187157]: 2025-12-03 00:19:20.379 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.381 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[6fbfbf2c-c2b2-4a3f-8619-2f6c571b9434]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.382 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.383 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.383 104348 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.383 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.383 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d1297d41-04fa-4d26-9314-d812468fa386]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.384 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.385 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[19a79451-464b-497f-a57c-8bc8f2d0363c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.386 104348 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: global
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     log         /dev/log local0 debug
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     log-tag     haproxy-metadata-proxy-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     user        root
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     group       root
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     maxconn     1024
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     pidfile     /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     daemon
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: defaults
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     log global
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     mode http
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     option httplog
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     option dontlognull
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     option http-server-close
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     option forwardfor
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     retries                 3
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     timeout http-request    30s
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     timeout connect         30s
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     timeout client          32s
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     timeout server          32s
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     timeout http-keep-alive 30s
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: listen listener
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     bind 169.254.169.254:80
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:     http-request add-header X-OVN-Network-ID f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:19:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:20.386 104348 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'env', 'PROCESS_TAG=haproxy-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:19:20 compute-1 nova_compute[187157]: 2025-12-03 00:19:20.436 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:20 compute-1 podman[219111]: 2025-12-03 00:19:20.755289454 +0000 UTC m=+0.046425729 container create 3dd9c6e7455b66882299e7625e7da6d5b025d42d4860c3ebd9bf2d0210fe75b4 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_managed=true)
Dec 03 00:19:20 compute-1 systemd[1]: Started libpod-conmon-3dd9c6e7455b66882299e7625e7da6d5b025d42d4860c3ebd9bf2d0210fe75b4.scope.
Dec 03 00:19:20 compute-1 systemd[1]: Started libcrun container.
Dec 03 00:19:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e7dcb9cbf9af8f4efff09f3f9957fcb5b59b9e4423c6e5daec4d261cd304c502/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:19:20 compute-1 podman[219111]: 2025-12-03 00:19:20.733438583 +0000 UTC m=+0.024574888 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:19:20 compute-1 podman[219111]: 2025-12-03 00:19:20.843445276 +0000 UTC m=+0.134581641 container init 3dd9c6e7455b66882299e7625e7da6d5b025d42d4860c3ebd9bf2d0210fe75b4 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:19:20 compute-1 podman[219111]: 2025-12-03 00:19:20.849818171 +0000 UTC m=+0.140954446 container start 3dd9c6e7455b66882299e7625e7da6d5b025d42d4860c3ebd9bf2d0210fe75b4 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 03 00:19:20 compute-1 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[219127]: [NOTICE]   (219131) : New worker (219133) forked
Dec 03 00:19:20 compute-1 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[219127]: [NOTICE]   (219131) : Loading success.
Dec 03 00:19:21 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:21.987 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:19:21 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:21.988 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:19:22 compute-1 nova_compute[187157]: 2025-12-03 00:19:22.030 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:22 compute-1 ovn_controller[95464]: 2025-12-03T00:19:22Z|00234|binding|INFO|Claiming lport c926feac-0f5a-4138-a74f-f066c3bf5f80 for this chassis.
Dec 03 00:19:22 compute-1 ovn_controller[95464]: 2025-12-03T00:19:22Z|00235|binding|INFO|c926feac-0f5a-4138-a74f-f066c3bf5f80: Claiming fa:16:3e:91:4e:2a 10.100.0.9
Dec 03 00:19:22 compute-1 ovn_controller[95464]: 2025-12-03T00:19:22Z|00236|binding|INFO|Setting lport c926feac-0f5a-4138-a74f-f066c3bf5f80 up in Southbound
Dec 03 00:19:23 compute-1 podman[219151]: 2025-12-03 00:19:23.254449532 +0000 UTC m=+0.084383071 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, 
url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 03 00:19:23 compute-1 nova_compute[187157]: 2025-12-03 00:19:23.345 187161 INFO nova.compute.manager [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Post operation of migration started
Dec 03 00:19:23 compute-1 nova_compute[187157]: 2025-12-03 00:19:23.345 187161 WARNING neutronclient.v2_0.client [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:23 compute-1 nova_compute[187157]: 2025-12-03 00:19:23.753 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:23 compute-1 nova_compute[187157]: 2025-12-03 00:19:23.899 187161 WARNING neutronclient.v2_0.client [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:23 compute-1 nova_compute[187157]: 2025-12-03 00:19:23.899 187161 WARNING neutronclient.v2_0.client [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:23 compute-1 nova_compute[187157]: 2025-12-03 00:19:23.977 187161 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-c9a442a2-b67f-45a9-a7b3-2f866d137327" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:19:23 compute-1 nova_compute[187157]: 2025-12-03 00:19:23.977 187161 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-c9a442a2-b67f-45a9-a7b3-2f866d137327" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:19:23 compute-1 nova_compute[187157]: 2025-12-03 00:19:23.978 187161 DEBUG nova.network.neutron [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:19:23 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:23.989 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:24 compute-1 nova_compute[187157]: 2025-12-03 00:19:24.484 187161 WARNING neutronclient.v2_0.client [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:25 compute-1 nova_compute[187157]: 2025-12-03 00:19:25.202 187161 WARNING neutronclient.v2_0.client [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:25 compute-1 nova_compute[187157]: 2025-12-03 00:19:25.438 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:25 compute-1 nova_compute[187157]: 2025-12-03 00:19:25.451 187161 DEBUG nova.network.neutron [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Updating instance_info_cache with network_info: [{"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:19:25 compute-1 nova_compute[187157]: 2025-12-03 00:19:25.967 187161 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-c9a442a2-b67f-45a9-a7b3-2f866d137327" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:19:26 compute-1 podman[219172]: 2025-12-03 00:19:26.2083243 +0000 UTC m=+0.055371015 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:19:26 compute-1 nova_compute[187157]: 2025-12-03 00:19:26.489 187161 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:26 compute-1 nova_compute[187157]: 2025-12-03 00:19:26.489 187161 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:26 compute-1 nova_compute[187157]: 2025-12-03 00:19:26.490 187161 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:26 compute-1 nova_compute[187157]: 2025-12-03 00:19:26.493 187161 INFO nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 03 00:19:26 compute-1 virtqemud[186882]: Domain id=21 name='instance-0000001a' uuid=c9a442a2-b67f-45a9-a7b3-2f866d137327 is tainted: custom-monitor
Dec 03 00:19:27 compute-1 nova_compute[187157]: 2025-12-03 00:19:27.501 187161 INFO nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 03 00:19:27 compute-1 sshd-session[219192]: Invalid user sol from 45.148.10.240 port 52294
Dec 03 00:19:27 compute-1 sshd-session[219192]: Connection closed by invalid user sol 45.148.10.240 port 52294 [preauth]
Dec 03 00:19:28 compute-1 nova_compute[187157]: 2025-12-03 00:19:28.507 187161 INFO nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 03 00:19:28 compute-1 nova_compute[187157]: 2025-12-03 00:19:28.513 187161 DEBUG nova.compute.manager [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:19:28 compute-1 nova_compute[187157]: 2025-12-03 00:19:28.756 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:29 compute-1 nova_compute[187157]: 2025-12-03 00:19:29.025 187161 DEBUG nova.objects.instance [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 03 00:19:30 compute-1 nova_compute[187157]: 2025-12-03 00:19:30.045 187161 WARNING neutronclient.v2_0.client [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:30 compute-1 nova_compute[187157]: 2025-12-03 00:19:30.440 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:30 compute-1 nova_compute[187157]: 2025-12-03 00:19:30.934 187161 WARNING neutronclient.v2_0.client [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:30 compute-1 nova_compute[187157]: 2025-12-03 00:19:30.934 187161 WARNING neutronclient.v2_0.client [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:33 compute-1 nova_compute[187157]: 2025-12-03 00:19:33.800 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:35 compute-1 nova_compute[187157]: 2025-12-03 00:19:35.441 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:35 compute-1 podman[197537]: time="2025-12-03T00:19:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:19:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:19:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:19:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:19:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3081 "" "Go-http-client/1.1"
Dec 03 00:19:36 compute-1 podman[219195]: 2025-12-03 00:19:36.235974419 +0000 UTC m=+0.071261502 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:19:38 compute-1 nova_compute[187157]: 2025-12-03 00:19:38.847 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:39 compute-1 nova_compute[187157]: 2025-12-03 00:19:39.160 187161 DEBUG nova.compute.manager [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpazvfqycv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e3ecd0e-4de1-44c9-805b-8d695da6b95e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 03 00:19:39 compute-1 podman[219219]: 2025-12-03 00:19:39.220503223 +0000 UTC m=+0.051744808 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 03 00:19:39 compute-1 podman[219220]: 2025-12-03 00:19:39.244746982 +0000 UTC m=+0.077971676 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 03 00:19:40 compute-1 nova_compute[187157]: 2025-12-03 00:19:40.346 187161 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-2e3ecd0e-4de1-44c9-805b-8d695da6b95e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:19:40 compute-1 nova_compute[187157]: 2025-12-03 00:19:40.347 187161 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-2e3ecd0e-4de1-44c9-805b-8d695da6b95e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:19:40 compute-1 nova_compute[187157]: 2025-12-03 00:19:40.347 187161 DEBUG nova.network.neutron [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:19:40 compute-1 nova_compute[187157]: 2025-12-03 00:19:40.442 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:41 compute-1 nova_compute[187157]: 2025-12-03 00:19:41.167 187161 WARNING neutronclient.v2_0.client [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:42 compute-1 nova_compute[187157]: 2025-12-03 00:19:42.993 187161 WARNING neutronclient.v2_0.client [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:43 compute-1 nova_compute[187157]: 2025-12-03 00:19:43.209 187161 DEBUG nova.network.neutron [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Updating instance_info_cache with network_info: [{"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:19:43 compute-1 nova_compute[187157]: 2025-12-03 00:19:43.717 187161 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-2e3ecd0e-4de1-44c9-805b-8d695da6b95e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:19:43 compute-1 nova_compute[187157]: 2025-12-03 00:19:43.729 187161 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpazvfqycv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e3ecd0e-4de1-44c9-805b-8d695da6b95e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 03 00:19:43 compute-1 nova_compute[187157]: 2025-12-03 00:19:43.729 187161 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Creating instance directory: /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 03 00:19:43 compute-1 nova_compute[187157]: 2025-12-03 00:19:43.730 187161 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Creating disk.info with the contents: {'/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk': 'qcow2', '/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 03 00:19:43 compute-1 nova_compute[187157]: 2025-12-03 00:19:43.730 187161 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 03 00:19:43 compute-1 nova_compute[187157]: 2025-12-03 00:19:43.730 187161 DEBUG nova.objects.instance [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2e3ecd0e-4de1-44c9-805b-8d695da6b95e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:19:43 compute-1 nova_compute[187157]: 2025-12-03 00:19:43.891 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.237 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.240 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.241 187161 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.291 187161 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.292 187161 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.292 187161 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.293 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.295 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.296 187161 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.346 187161 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.346 187161 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.374 187161 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.375 187161 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.082s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.375 187161 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.425 187161 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.426 187161 DEBUG nova.virt.disk.api [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.426 187161 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.479 187161 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.480 187161 DEBUG nova.virt.disk.api [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.480 187161 DEBUG nova.objects.instance [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 2e3ecd0e-4de1-44c9-805b-8d695da6b95e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.988 187161 DEBUG nova.objects.base [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<2e3ecd0e-4de1-44c9-805b-8d695da6b95e> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 03 00:19:44 compute-1 nova_compute[187157]: 2025-12-03 00:19:44.989 187161 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.009 187161 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk.config 497664" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.010 187161 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.011 187161 DEBUG nova.virt.libvirt.vif [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:18:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-104589744',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1045897',id=27,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:18:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-78r0m2qy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:18:33Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=2e3ecd0e-4de1-44c9-805b-8d695da6b95e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.011 187161 DEBUG nova.network.os_vif_util [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.013 187161 DEBUG nova.network.os_vif_util [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:ff:2c,bridge_name='br-int',has_traffic_filtering=True,id=8b0adcad-4e57-4150-b6d7-890ceb893e2e,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b0adcad-4e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.013 187161 DEBUG os_vif [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:ff:2c,bridge_name='br-int',has_traffic_filtering=True,id=8b0adcad-4e57-4150-b6d7-890ceb893e2e,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b0adcad-4e') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.013 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.014 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.014 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.015 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.015 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9f5fee56-14ae-5e8c-9c1b-69c88a6bd4cb', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.060 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.062 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.064 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.064 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b0adcad-4e, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.064 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap8b0adcad-4e, col_values=(('qos', UUID('0ae6a953-601b-420d-90d6-3c0a4495e0ba')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.064 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap8b0adcad-4e, col_values=(('external_ids', {'iface-id': '8b0adcad-4e57-4150-b6d7-890ceb893e2e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9d:ff:2c', 'vm-uuid': '2e3ecd0e-4de1-44c9-805b-8d695da6b95e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.065 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:45 compute-1 NetworkManager[55553]: <info>  [1764721185.0672] manager: (tap8b0adcad-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.068 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.074 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.075 187161 INFO os_vif [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:ff:2c,bridge_name='br-int',has_traffic_filtering=True,id=8b0adcad-4e57-4150-b6d7-890ceb893e2e,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b0adcad-4e')
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.076 187161 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.076 187161 DEBUG nova.compute.manager [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpazvfqycv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e3ecd0e-4de1-44c9-805b-8d695da6b95e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.076 187161 WARNING neutronclient.v2_0.client [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:45 compute-1 nova_compute[187157]: 2025-12-03 00:19:45.927 187161 WARNING neutronclient.v2_0.client [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:46 compute-1 nova_compute[187157]: 2025-12-03 00:19:46.526 187161 DEBUG nova.network.neutron [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Port 8b0adcad-4e57-4150-b6d7-890ceb893e2e updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 03 00:19:46 compute-1 nova_compute[187157]: 2025-12-03 00:19:46.543 187161 DEBUG nova.compute.manager [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpazvfqycv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e3ecd0e-4de1-44c9-805b-8d695da6b95e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 03 00:19:47 compute-1 nova_compute[187157]: 2025-12-03 00:19:47.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:48 compute-1 nova_compute[187157]: 2025-12-03 00:19:48.893 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:49 compute-1 openstack_network_exporter[199685]: ERROR   00:19:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:19:49 compute-1 openstack_network_exporter[199685]: ERROR   00:19:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:19:49 compute-1 openstack_network_exporter[199685]: ERROR   00:19:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:19:49 compute-1 openstack_network_exporter[199685]: ERROR   00:19:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:19:49 compute-1 openstack_network_exporter[199685]: ERROR   00:19:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:19:49 compute-1 nova_compute[187157]: 2025-12-03 00:19:49.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:50 compute-1 nova_compute[187157]: 2025-12-03 00:19:50.065 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:50 compute-1 kernel: tap8b0adcad-4e: entered promiscuous mode
Dec 03 00:19:50 compute-1 nova_compute[187157]: 2025-12-03 00:19:50.263 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:50 compute-1 ovn_controller[95464]: 2025-12-03T00:19:50Z|00237|binding|INFO|Claiming lport 8b0adcad-4e57-4150-b6d7-890ceb893e2e for this additional chassis.
Dec 03 00:19:50 compute-1 NetworkManager[55553]: <info>  [1764721190.2653] manager: (tap8b0adcad-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Dec 03 00:19:50 compute-1 ovn_controller[95464]: 2025-12-03T00:19:50Z|00238|binding|INFO|8b0adcad-4e57-4150-b6d7-890ceb893e2e: Claiming fa:16:3e:9d:ff:2c 10.100.0.10
Dec 03 00:19:50 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:50.274 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:ff:2c 10.100.0.10'], port_security=['fa:16:3e:9d:ff:2c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '2e3ecd0e-4de1-44c9-805b-8d695da6b95e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'aa7bb0b8-346d-4df1-ade9-c8e68672df4d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=8b0adcad-4e57-4150-b6d7-890ceb893e2e) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:19:50 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:50.275 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 8b0adcad-4e57-4150-b6d7-890ceb893e2e in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 unbound from our chassis
Dec 03 00:19:50 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:50.276 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:19:50 compute-1 ovn_controller[95464]: 2025-12-03T00:19:50Z|00239|binding|INFO|Setting lport 8b0adcad-4e57-4150-b6d7-890ceb893e2e ovn-installed in OVS
Dec 03 00:19:50 compute-1 nova_compute[187157]: 2025-12-03 00:19:50.278 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:50 compute-1 nova_compute[187157]: 2025-12-03 00:19:50.281 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:50 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:50.291 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[29bbcdaa-75ef-48b1-be06-2ef8e53e96b8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:50 compute-1 systemd-udevd[219297]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:19:50 compute-1 systemd-machined[153454]: New machine qemu-22-instance-0000001b.
Dec 03 00:19:50 compute-1 NetworkManager[55553]: <info>  [1764721190.3062] device (tap8b0adcad-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:19:50 compute-1 NetworkManager[55553]: <info>  [1764721190.3075] device (tap8b0adcad-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:19:50 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:50.320 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[5633cf7a-d0b9-4ace-bdf5-1ffa6f4be2ba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:50 compute-1 systemd[1]: Started Virtual Machine qemu-22-instance-0000001b.
Dec 03 00:19:50 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:50.322 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[0f08c8f3-2723-44f0-aea5-fc362eb26439]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:50 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:50.346 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[1d528bd9-023a-49eb-978f-0e57f867441a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:50 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:50.360 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[43e83835-fdbc-47f5-873d-a48d08a71f0e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7ff943d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:96:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 26, 'tx_packets': 5, 'rx_bytes': 1372, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 26, 'tx_packets': 5, 'rx_bytes': 1372, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517986, 'reachable_time': 32705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219309, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:50 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:50.375 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[7836e835-a0b5-4c2a-8166-4885acbcfcba]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7ff943d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517996, 'tstamp': 517996}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219311, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7ff943d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517999, 'tstamp': 517999}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219311, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:50 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:50.376 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ff943d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:50 compute-1 nova_compute[187157]: 2025-12-03 00:19:50.378 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:50 compute-1 nova_compute[187157]: 2025-12-03 00:19:50.378 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:50 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:50.379 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7ff943d-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:50 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:50.379 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:19:50 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:50.379 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7ff943d-e0, col_values=(('external_ids', {'iface-id': '636cd919-869d-4a8a-92fa-ec7c18804da5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:50 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:50.380 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:19:50 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:19:50.381 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[f78381bb-d9d9-4a67-bf67-772e866ed54a]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:50 compute-1 nova_compute[187157]: 2025-12-03 00:19:50.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:51 compute-1 nova_compute[187157]: 2025-12-03 00:19:51.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:52 compute-1 nova_compute[187157]: 2025-12-03 00:19:52.210 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:52 compute-1 nova_compute[187157]: 2025-12-03 00:19:52.210 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:52 compute-1 nova_compute[187157]: 2025-12-03 00:19:52.210 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:52 compute-1 nova_compute[187157]: 2025-12-03 00:19:52.210 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:19:53 compute-1 ovn_controller[95464]: 2025-12-03T00:19:53Z|00240|binding|INFO|Claiming lport 8b0adcad-4e57-4150-b6d7-890ceb893e2e for this chassis.
Dec 03 00:19:53 compute-1 ovn_controller[95464]: 2025-12-03T00:19:53Z|00241|binding|INFO|8b0adcad-4e57-4150-b6d7-890ceb893e2e: Claiming fa:16:3e:9d:ff:2c 10.100.0.10
Dec 03 00:19:53 compute-1 ovn_controller[95464]: 2025-12-03T00:19:53Z|00242|binding|INFO|Setting lport 8b0adcad-4e57-4150-b6d7-890ceb893e2e up in Southbound
Dec 03 00:19:53 compute-1 nova_compute[187157]: 2025-12-03 00:19:53.255 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:53 compute-1 nova_compute[187157]: 2025-12-03 00:19:53.310 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:53 compute-1 nova_compute[187157]: 2025-12-03 00:19:53.310 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:53 compute-1 nova_compute[187157]: 2025-12-03 00:19:53.361 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:53 compute-1 nova_compute[187157]: 2025-12-03 00:19:53.366 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:53 compute-1 nova_compute[187157]: 2025-12-03 00:19:53.417 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:53 compute-1 nova_compute[187157]: 2025-12-03 00:19:53.418 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:53 compute-1 nova_compute[187157]: 2025-12-03 00:19:53.467 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:53 compute-1 nova_compute[187157]: 2025-12-03 00:19:53.615 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:19:53 compute-1 nova_compute[187157]: 2025-12-03 00:19:53.616 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:53 compute-1 nova_compute[187157]: 2025-12-03 00:19:53.635 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:53 compute-1 nova_compute[187157]: 2025-12-03 00:19:53.636 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5513MB free_disk=73.10697555541992GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:19:53 compute-1 nova_compute[187157]: 2025-12-03 00:19:53.636 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:53 compute-1 nova_compute[187157]: 2025-12-03 00:19:53.637 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:53 compute-1 nova_compute[187157]: 2025-12-03 00:19:53.895 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:54 compute-1 podman[219343]: 2025-12-03 00:19:54.238658809 +0000 UTC m=+0.067376018 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 03 00:19:55 compute-1 nova_compute[187157]: 2025-12-03 00:19:55.067 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:55 compute-1 nova_compute[187157]: 2025-12-03 00:19:55.166 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Migration for instance 2e3ecd0e-4de1-44c9-805b-8d695da6b95e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:19:55 compute-1 nova_compute[187157]: 2025-12-03 00:19:55.678 187161 INFO nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Updating resource usage from migration d693345a-fa5e-4845-a60d-9331bd660235
Dec 03 00:19:55 compute-1 nova_compute[187157]: 2025-12-03 00:19:55.679 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Starting to track incoming migration d693345a-fa5e-4845-a60d-9331bd660235 with flavor b2669e62-ef04-4b34-b3d6-69efcfbafbdc _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 03 00:19:55 compute-1 nova_compute[187157]: 2025-12-03 00:19:55.784 187161 INFO nova.compute.manager [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Post operation of migration started
Dec 03 00:19:55 compute-1 nova_compute[187157]: 2025-12-03 00:19:55.786 187161 WARNING neutronclient.v2_0.client [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:55 compute-1 nova_compute[187157]: 2025-12-03 00:19:55.961 187161 WARNING neutronclient.v2_0.client [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:55 compute-1 nova_compute[187157]: 2025-12-03 00:19:55.962 187161 WARNING neutronclient.v2_0.client [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:56 compute-1 nova_compute[187157]: 2025-12-03 00:19:56.053 187161 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-2e3ecd0e-4de1-44c9-805b-8d695da6b95e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:19:56 compute-1 nova_compute[187157]: 2025-12-03 00:19:56.053 187161 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-2e3ecd0e-4de1-44c9-805b-8d695da6b95e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:19:56 compute-1 nova_compute[187157]: 2025-12-03 00:19:56.053 187161 DEBUG nova.network.neutron [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:19:56 compute-1 nova_compute[187157]: 2025-12-03 00:19:56.217 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance c9a442a2-b67f-45a9-a7b3-2f866d137327 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:19:56 compute-1 nova_compute[187157]: 2025-12-03 00:19:56.568 187161 WARNING neutronclient.v2_0.client [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:56 compute-1 nova_compute[187157]: 2025-12-03 00:19:56.723 187161 WARNING nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 2e3ecd0e-4de1-44c9-805b-8d695da6b95e has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Dec 03 00:19:56 compute-1 nova_compute[187157]: 2025-12-03 00:19:56.724 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:19:56 compute-1 nova_compute[187157]: 2025-12-03 00:19:56.724 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:19:53 up  1:26,  0 user,  load average: 0.06, 0.15, 0.26\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e363b47741a1476ca7e5987b6d15acb5': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:19:56 compute-1 nova_compute[187157]: 2025-12-03 00:19:56.767 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:19:57 compute-1 podman[219365]: 2025-12-03 00:19:57.216717415 +0000 UTC m=+0.059966099 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251202)
Dec 03 00:19:57 compute-1 nova_compute[187157]: 2025-12-03 00:19:57.287 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:19:57 compute-1 nova_compute[187157]: 2025-12-03 00:19:57.800 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:19:57 compute-1 nova_compute[187157]: 2025-12-03 00:19:57.800 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.163s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:58 compute-1 nova_compute[187157]: 2025-12-03 00:19:58.780 187161 WARNING neutronclient.v2_0.client [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:58 compute-1 nova_compute[187157]: 2025-12-03 00:19:58.899 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:58 compute-1 nova_compute[187157]: 2025-12-03 00:19:58.983 187161 DEBUG nova.network.neutron [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Updating instance_info_cache with network_info: [{"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:19:59 compute-1 nova_compute[187157]: 2025-12-03 00:19:59.490 187161 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-2e3ecd0e-4de1-44c9-805b-8d695da6b95e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:20:00 compute-1 nova_compute[187157]: 2025-12-03 00:20:00.010 187161 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:00 compute-1 nova_compute[187157]: 2025-12-03 00:20:00.011 187161 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:00 compute-1 nova_compute[187157]: 2025-12-03 00:20:00.011 187161 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:00 compute-1 nova_compute[187157]: 2025-12-03 00:20:00.015 187161 INFO nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 03 00:20:00 compute-1 virtqemud[186882]: Domain id=22 name='instance-0000001b' uuid=2e3ecd0e-4de1-44c9-805b-8d695da6b95e is tainted: custom-monitor
Dec 03 00:20:00 compute-1 nova_compute[187157]: 2025-12-03 00:20:00.069 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:01 compute-1 nova_compute[187157]: 2025-12-03 00:20:01.020 187161 INFO nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 03 00:20:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:01.746 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:01.746 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:01.747 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:02 compute-1 nova_compute[187157]: 2025-12-03 00:20:02.025 187161 INFO nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 03 00:20:02 compute-1 nova_compute[187157]: 2025-12-03 00:20:02.031 187161 DEBUG nova.compute.manager [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:20:02 compute-1 nova_compute[187157]: 2025-12-03 00:20:02.551 187161 DEBUG nova.objects.instance [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 03 00:20:03 compute-1 nova_compute[187157]: 2025-12-03 00:20:03.901 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:04 compute-1 nova_compute[187157]: 2025-12-03 00:20:04.025 187161 WARNING neutronclient.v2_0.client [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:20:04 compute-1 nova_compute[187157]: 2025-12-03 00:20:04.304 187161 WARNING neutronclient.v2_0.client [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:20:04 compute-1 nova_compute[187157]: 2025-12-03 00:20:04.306 187161 WARNING neutronclient.v2_0.client [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:20:04 compute-1 nova_compute[187157]: 2025-12-03 00:20:04.795 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:20:04 compute-1 nova_compute[187157]: 2025-12-03 00:20:04.795 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:20:04 compute-1 nova_compute[187157]: 2025-12-03 00:20:04.795 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:20:04 compute-1 nova_compute[187157]: 2025-12-03 00:20:04.796 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:20:04 compute-1 nova_compute[187157]: 2025-12-03 00:20:04.796 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:20:05 compute-1 nova_compute[187157]: 2025-12-03 00:20:05.070 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:05 compute-1 podman[197537]: time="2025-12-03T00:20:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:20:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:20:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:20:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:20:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3081 "" "Go-http-client/1.1"
Dec 03 00:20:07 compute-1 podman[219387]: 2025-12-03 00:20:07.204264588 +0000 UTC m=+0.049919395 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:20:08 compute-1 nova_compute[187157]: 2025-12-03 00:20:08.901 187161 DEBUG oslo_concurrency.lockutils [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:08 compute-1 nova_compute[187157]: 2025-12-03 00:20:08.902 187161 DEBUG oslo_concurrency.lockutils [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:08 compute-1 nova_compute[187157]: 2025-12-03 00:20:08.902 187161 DEBUG oslo_concurrency.lockutils [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:08 compute-1 nova_compute[187157]: 2025-12-03 00:20:08.903 187161 DEBUG oslo_concurrency.lockutils [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:08 compute-1 nova_compute[187157]: 2025-12-03 00:20:08.903 187161 DEBUG oslo_concurrency.lockutils [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:08 compute-1 nova_compute[187157]: 2025-12-03 00:20:08.904 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:08 compute-1 nova_compute[187157]: 2025-12-03 00:20:08.915 187161 INFO nova.compute.manager [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Terminating instance
Dec 03 00:20:09 compute-1 nova_compute[187157]: 2025-12-03 00:20:09.431 187161 DEBUG nova.compute.manager [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:20:09 compute-1 kernel: tap8b0adcad-4e (unregistering): left promiscuous mode
Dec 03 00:20:09 compute-1 NetworkManager[55553]: <info>  [1764721209.4592] device (tap8b0adcad-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:20:09 compute-1 ovn_controller[95464]: 2025-12-03T00:20:09Z|00243|binding|INFO|Releasing lport 8b0adcad-4e57-4150-b6d7-890ceb893e2e from this chassis (sb_readonly=0)
Dec 03 00:20:09 compute-1 ovn_controller[95464]: 2025-12-03T00:20:09Z|00244|binding|INFO|Setting lport 8b0adcad-4e57-4150-b6d7-890ceb893e2e down in Southbound
Dec 03 00:20:09 compute-1 nova_compute[187157]: 2025-12-03 00:20:09.468 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:09 compute-1 ovn_controller[95464]: 2025-12-03T00:20:09Z|00245|binding|INFO|Removing iface tap8b0adcad-4e ovn-installed in OVS
Dec 03 00:20:09 compute-1 nova_compute[187157]: 2025-12-03 00:20:09.470 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:09.476 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:ff:2c 10.100.0.10'], port_security=['fa:16:3e:9d:ff:2c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '2e3ecd0e-4de1-44c9-805b-8d695da6b95e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'aa7bb0b8-346d-4df1-ade9-c8e68672df4d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=8b0adcad-4e57-4150-b6d7-890ceb893e2e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:20:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:09.476 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 8b0adcad-4e57-4150-b6d7-890ceb893e2e in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 unbound from our chassis
Dec 03 00:20:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:09.478 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:20:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:09.491 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[39b94f5e-6b33-4f5a-82e3-0796a2a604dd]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:09 compute-1 nova_compute[187157]: 2025-12-03 00:20:09.500 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:09 compute-1 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Dec 03 00:20:09 compute-1 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001b.scope: Consumed 2.155s CPU time.
Dec 03 00:20:09 compute-1 systemd-machined[153454]: Machine qemu-22-instance-0000001b terminated.
Dec 03 00:20:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:09.515 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[b7b65bb3-ef9b-4d00-a8b1-7e3b1b2668a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:09.518 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[cefa64e8-b5a7-489a-aed8-7b647cecf186]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:09.546 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[1da28802-7561-4d30-9272-457654ae4665]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:09 compute-1 podman[219413]: 2025-12-03 00:20:09.564242294 +0000 UTC m=+0.057440237 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 03 00:20:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:09.567 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[bed98950-e28c-46fe-bd4d-2369edd93198]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7ff943d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:96:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 46, 'tx_packets': 7, 'rx_bytes': 2212, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 46, 'tx_packets': 7, 'rx_bytes': 2212, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517986, 'reachable_time': 32705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219451, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:09.581 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[1410e10b-4b35-48e6-8145-a4251798dfc3]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7ff943d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517996, 'tstamp': 517996}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219459, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7ff943d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517999, 'tstamp': 517999}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219459, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:09.583 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ff943d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:09 compute-1 nova_compute[187157]: 2025-12-03 00:20:09.584 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:09 compute-1 nova_compute[187157]: 2025-12-03 00:20:09.590 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:09.591 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7ff943d-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:09.591 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:20:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:09.591 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7ff943d-e0, col_values=(('external_ids', {'iface-id': '636cd919-869d-4a8a-92fa-ec7c18804da5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:09.591 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:20:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:09.593 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[f58c3c9f-36b0-4d23-8c57-2471366ff72e]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:09 compute-1 nova_compute[187157]: 2025-12-03 00:20:09.605 187161 DEBUG nova.compute.manager [req-10421f3d-a283-4fb0-844d-dbb26a1dc050 req-dc83cd42-b653-4d72-8e58-a59bb7ce6084 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:20:09 compute-1 nova_compute[187157]: 2025-12-03 00:20:09.606 187161 DEBUG oslo_concurrency.lockutils [req-10421f3d-a283-4fb0-844d-dbb26a1dc050 req-dc83cd42-b653-4d72-8e58-a59bb7ce6084 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:09 compute-1 nova_compute[187157]: 2025-12-03 00:20:09.606 187161 DEBUG oslo_concurrency.lockutils [req-10421f3d-a283-4fb0-844d-dbb26a1dc050 req-dc83cd42-b653-4d72-8e58-a59bb7ce6084 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:09 compute-1 nova_compute[187157]: 2025-12-03 00:20:09.606 187161 DEBUG oslo_concurrency.lockutils [req-10421f3d-a283-4fb0-844d-dbb26a1dc050 req-dc83cd42-b653-4d72-8e58-a59bb7ce6084 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:09 compute-1 nova_compute[187157]: 2025-12-03 00:20:09.606 187161 DEBUG nova.compute.manager [req-10421f3d-a283-4fb0-844d-dbb26a1dc050 req-dc83cd42-b653-4d72-8e58-a59bb7ce6084 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] No waiting events found dispatching network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:20:09 compute-1 nova_compute[187157]: 2025-12-03 00:20:09.607 187161 DEBUG nova.compute.manager [req-10421f3d-a283-4fb0-844d-dbb26a1dc050 req-dc83cd42-b653-4d72-8e58-a59bb7ce6084 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:20:09 compute-1 podman[219417]: 2025-12-03 00:20:09.608255713 +0000 UTC m=+0.098814732 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:20:09 compute-1 kernel: tap8b0adcad-4e: entered promiscuous mode
Dec 03 00:20:09 compute-1 kernel: tap8b0adcad-4e (unregistering): left promiscuous mode
Dec 03 00:20:09 compute-1 NetworkManager[55553]: <info>  [1764721209.6528] manager: (tap8b0adcad-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Dec 03 00:20:09 compute-1 nova_compute[187157]: 2025-12-03 00:20:09.656 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:09 compute-1 nova_compute[187157]: 2025-12-03 00:20:09.695 187161 INFO nova.virt.libvirt.driver [-] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Instance destroyed successfully.
Dec 03 00:20:09 compute-1 nova_compute[187157]: 2025-12-03 00:20:09.695 187161 DEBUG nova.objects.instance [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lazy-loading 'resources' on Instance uuid 2e3ecd0e-4de1-44c9-805b-8d695da6b95e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.073 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.202 187161 DEBUG nova.virt.libvirt.vif [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-03T00:18:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-104589744',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1045897',id=27,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:18:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-78r0m2qy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:20:03Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=2e3ecd0e-4de1-44c9-805b-8d695da6b95e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.202 187161 DEBUG nova.network.os_vif_util [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converting VIF {"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.203 187161 DEBUG nova.network.os_vif_util [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9d:ff:2c,bridge_name='br-int',has_traffic_filtering=True,id=8b0adcad-4e57-4150-b6d7-890ceb893e2e,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b0adcad-4e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.203 187161 DEBUG os_vif [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9d:ff:2c,bridge_name='br-int',has_traffic_filtering=True,id=8b0adcad-4e57-4150-b6d7-890ceb893e2e,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b0adcad-4e') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.204 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.205 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b0adcad-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.207 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.208 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.208 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=0ae6a953-601b-420d-90d6-3c0a4495e0ba) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.209 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.210 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.212 187161 INFO os_vif [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9d:ff:2c,bridge_name='br-int',has_traffic_filtering=True,id=8b0adcad-4e57-4150-b6d7-890ceb893e2e,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b0adcad-4e')
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.212 187161 INFO nova.virt.libvirt.driver [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Deleting instance files /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e_del
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.213 187161 INFO nova.virt.libvirt.driver [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Deletion of /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e_del complete
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.724 187161 INFO nova.compute.manager [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Took 1.29 seconds to destroy the instance on the hypervisor.
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.724 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.724 187161 DEBUG nova.compute.manager [-] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.724 187161 DEBUG nova.network.neutron [-] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.725 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:20:10 compute-1 nova_compute[187157]: 2025-12-03 00:20:10.924 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:20:11 compute-1 nova_compute[187157]: 2025-12-03 00:20:11.662 187161 DEBUG nova.compute.manager [req-0b5c39c1-2fbd-4cae-8a2e-862a33702fa5 req-f66bd638-fef5-432d-a2f4-61a3657793e8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:20:11 compute-1 nova_compute[187157]: 2025-12-03 00:20:11.662 187161 DEBUG oslo_concurrency.lockutils [req-0b5c39c1-2fbd-4cae-8a2e-862a33702fa5 req-f66bd638-fef5-432d-a2f4-61a3657793e8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:11 compute-1 nova_compute[187157]: 2025-12-03 00:20:11.662 187161 DEBUG oslo_concurrency.lockutils [req-0b5c39c1-2fbd-4cae-8a2e-862a33702fa5 req-f66bd638-fef5-432d-a2f4-61a3657793e8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:11 compute-1 nova_compute[187157]: 2025-12-03 00:20:11.662 187161 DEBUG oslo_concurrency.lockutils [req-0b5c39c1-2fbd-4cae-8a2e-862a33702fa5 req-f66bd638-fef5-432d-a2f4-61a3657793e8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:11 compute-1 nova_compute[187157]: 2025-12-03 00:20:11.663 187161 DEBUG nova.compute.manager [req-0b5c39c1-2fbd-4cae-8a2e-862a33702fa5 req-f66bd638-fef5-432d-a2f4-61a3657793e8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] No waiting events found dispatching network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:20:11 compute-1 nova_compute[187157]: 2025-12-03 00:20:11.663 187161 DEBUG nova.compute.manager [req-0b5c39c1-2fbd-4cae-8a2e-862a33702fa5 req-f66bd638-fef5-432d-a2f4-61a3657793e8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:20:11 compute-1 nova_compute[187157]: 2025-12-03 00:20:11.663 187161 DEBUG nova.compute.manager [req-0b5c39c1-2fbd-4cae-8a2e-862a33702fa5 req-f66bd638-fef5-432d-a2f4-61a3657793e8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-deleted-8b0adcad-4e57-4150-b6d7-890ceb893e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:20:11 compute-1 nova_compute[187157]: 2025-12-03 00:20:11.663 187161 INFO nova.compute.manager [req-0b5c39c1-2fbd-4cae-8a2e-862a33702fa5 req-f66bd638-fef5-432d-a2f4-61a3657793e8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Neutron deleted interface 8b0adcad-4e57-4150-b6d7-890ceb893e2e; detaching it from the instance and deleting it from the info cache
Dec 03 00:20:11 compute-1 nova_compute[187157]: 2025-12-03 00:20:11.664 187161 DEBUG nova.network.neutron [req-0b5c39c1-2fbd-4cae-8a2e-862a33702fa5 req-f66bd638-fef5-432d-a2f4-61a3657793e8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:20:11 compute-1 nova_compute[187157]: 2025-12-03 00:20:11.691 187161 DEBUG nova.network.neutron [-] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:20:12 compute-1 nova_compute[187157]: 2025-12-03 00:20:12.170 187161 DEBUG nova.compute.manager [req-0b5c39c1-2fbd-4cae-8a2e-862a33702fa5 req-f66bd638-fef5-432d-a2f4-61a3657793e8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Detach interface failed, port_id=8b0adcad-4e57-4150-b6d7-890ceb893e2e, reason: Instance 2e3ecd0e-4de1-44c9-805b-8d695da6b95e could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:20:12 compute-1 nova_compute[187157]: 2025-12-03 00:20:12.195 187161 INFO nova.compute.manager [-] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Took 1.47 seconds to deallocate network for instance.
Dec 03 00:20:12 compute-1 nova_compute[187157]: 2025-12-03 00:20:12.715 187161 DEBUG oslo_concurrency.lockutils [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:12 compute-1 nova_compute[187157]: 2025-12-03 00:20:12.715 187161 DEBUG oslo_concurrency.lockutils [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:12 compute-1 nova_compute[187157]: 2025-12-03 00:20:12.720 187161 DEBUG oslo_concurrency.lockutils [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:12 compute-1 nova_compute[187157]: 2025-12-03 00:20:12.750 187161 INFO nova.scheduler.client.report [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Deleted allocations for instance 2e3ecd0e-4de1-44c9-805b-8d695da6b95e
Dec 03 00:20:13 compute-1 nova_compute[187157]: 2025-12-03 00:20:13.972 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:14 compute-1 nova_compute[187157]: 2025-12-03 00:20:14.500 187161 DEBUG oslo_concurrency.lockutils [None req-88a9ff46-0156-46e9-be26-926dd3c2a0f9 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.598s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:15 compute-1 nova_compute[187157]: 2025-12-03 00:20:15.173 187161 DEBUG oslo_concurrency.lockutils [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:15 compute-1 nova_compute[187157]: 2025-12-03 00:20:15.174 187161 DEBUG oslo_concurrency.lockutils [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:15 compute-1 nova_compute[187157]: 2025-12-03 00:20:15.174 187161 DEBUG oslo_concurrency.lockutils [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:15 compute-1 nova_compute[187157]: 2025-12-03 00:20:15.175 187161 DEBUG oslo_concurrency.lockutils [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:15 compute-1 nova_compute[187157]: 2025-12-03 00:20:15.175 187161 DEBUG oslo_concurrency.lockutils [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:15 compute-1 nova_compute[187157]: 2025-12-03 00:20:15.187 187161 INFO nova.compute.manager [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Terminating instance
Dec 03 00:20:15 compute-1 nova_compute[187157]: 2025-12-03 00:20:15.236 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:15 compute-1 nova_compute[187157]: 2025-12-03 00:20:15.703 187161 DEBUG nova.compute.manager [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:20:15 compute-1 kernel: tapc926feac-0f (unregistering): left promiscuous mode
Dec 03 00:20:15 compute-1 NetworkManager[55553]: <info>  [1764721215.7343] device (tapc926feac-0f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:20:15 compute-1 ovn_controller[95464]: 2025-12-03T00:20:15Z|00246|binding|INFO|Releasing lport c926feac-0f5a-4138-a74f-f066c3bf5f80 from this chassis (sb_readonly=0)
Dec 03 00:20:15 compute-1 ovn_controller[95464]: 2025-12-03T00:20:15Z|00247|binding|INFO|Setting lport c926feac-0f5a-4138-a74f-f066c3bf5f80 down in Southbound
Dec 03 00:20:15 compute-1 ovn_controller[95464]: 2025-12-03T00:20:15Z|00248|binding|INFO|Removing iface tapc926feac-0f ovn-installed in OVS
Dec 03 00:20:15 compute-1 nova_compute[187157]: 2025-12-03 00:20:15.739 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:15 compute-1 nova_compute[187157]: 2025-12-03 00:20:15.754 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:15 compute-1 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Dec 03 00:20:15 compute-1 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001a.scope: Consumed 3.559s CPU time.
Dec 03 00:20:15 compute-1 systemd-machined[153454]: Machine qemu-21-instance-0000001a terminated.
Dec 03 00:20:15 compute-1 nova_compute[187157]: 2025-12-03 00:20:15.922 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:15 compute-1 nova_compute[187157]: 2025-12-03 00:20:15.927 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:15 compute-1 nova_compute[187157]: 2025-12-03 00:20:15.961 187161 INFO nova.virt.libvirt.driver [-] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Instance destroyed successfully.
Dec 03 00:20:15 compute-1 nova_compute[187157]: 2025-12-03 00:20:15.961 187161 DEBUG nova.objects.instance [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lazy-loading 'resources' on Instance uuid c9a442a2-b67f-45a9-a7b3-2f866d137327 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:20:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:16.000 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:4e:2a 10.100.0.9'], port_security=['fa:16:3e:91:4e:2a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'c9a442a2-b67f-45a9-a7b3-2f866d137327', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'aa7bb0b8-346d-4df1-ade9-c8e68672df4d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=c926feac-0f5a-4138-a74f-f066c3bf5f80) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:20:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:16.001 104348 INFO neutron.agent.ovn.metadata.agent [-] Port c926feac-0f5a-4138-a74f-f066c3bf5f80 in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 unbound from our chassis
Dec 03 00:20:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:16.002 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:20:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:16.003 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[2df04a87-30ba-4ffb-b929-1b5de9359b81]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:16.003 104348 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 namespace which is not needed anymore
Dec 03 00:20:16 compute-1 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[219127]: [NOTICE]   (219131) : haproxy version is 3.0.5-8e879a5
Dec 03 00:20:16 compute-1 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[219127]: [NOTICE]   (219131) : path to executable is /usr/sbin/haproxy
Dec 03 00:20:16 compute-1 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[219127]: [WARNING]  (219131) : Exiting Master process...
Dec 03 00:20:16 compute-1 podman[219519]: 2025-12-03 00:20:16.101192238 +0000 UTC m=+0.026488445 container kill 3dd9c6e7455b66882299e7625e7da6d5b025d42d4860c3ebd9bf2d0210fe75b4 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 03 00:20:16 compute-1 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[219127]: [ALERT]    (219131) : Current worker (219133) exited with code 143 (Terminated)
Dec 03 00:20:16 compute-1 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[219127]: [WARNING]  (219131) : All workers exited. Exiting... (0)
Dec 03 00:20:16 compute-1 systemd[1]: libpod-3dd9c6e7455b66882299e7625e7da6d5b025d42d4860c3ebd9bf2d0210fe75b4.scope: Deactivated successfully.
Dec 03 00:20:16 compute-1 podman[219535]: 2025-12-03 00:20:16.141320863 +0000 UTC m=+0.023388489 container died 3dd9c6e7455b66882299e7625e7da6d5b025d42d4860c3ebd9bf2d0210fe75b4 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202)
Dec 03 00:20:16 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3dd9c6e7455b66882299e7625e7da6d5b025d42d4860c3ebd9bf2d0210fe75b4-userdata-shm.mount: Deactivated successfully.
Dec 03 00:20:16 compute-1 systemd[1]: var-lib-containers-storage-overlay-e7dcb9cbf9af8f4efff09f3f9957fcb5b59b9e4423c6e5daec4d261cd304c502-merged.mount: Deactivated successfully.
Dec 03 00:20:16 compute-1 podman[219535]: 2025-12-03 00:20:16.168803532 +0000 UTC m=+0.050871138 container cleanup 3dd9c6e7455b66882299e7625e7da6d5b025d42d4860c3ebd9bf2d0210fe75b4 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 03 00:20:16 compute-1 systemd[1]: libpod-conmon-3dd9c6e7455b66882299e7625e7da6d5b025d42d4860c3ebd9bf2d0210fe75b4.scope: Deactivated successfully.
Dec 03 00:20:16 compute-1 podman[219537]: 2025-12-03 00:20:16.188724765 +0000 UTC m=+0.062058469 container remove 3dd9c6e7455b66882299e7625e7da6d5b025d42d4860c3ebd9bf2d0210fe75b4 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 00:20:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:16.193 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[78b7ce90-787e-477f-84f7-e189e9fa928d]: (4, ("Wed Dec  3 12:20:16 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 (3dd9c6e7455b66882299e7625e7da6d5b025d42d4860c3ebd9bf2d0210fe75b4)\n3dd9c6e7455b66882299e7625e7da6d5b025d42d4860c3ebd9bf2d0210fe75b4\nWed Dec  3 12:20:16 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 (3dd9c6e7455b66882299e7625e7da6d5b025d42d4860c3ebd9bf2d0210fe75b4)\n3dd9c6e7455b66882299e7625e7da6d5b025d42d4860c3ebd9bf2d0210fe75b4\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:16.195 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[dc519b52-ed48-47a3-baa4-f90a8275c080]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:16.195 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:20:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:16.195 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[4bfe16a4-cd73-4d55-aed6-721cf0fb9d5e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:16.196 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ff943d-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:16 compute-1 nova_compute[187157]: 2025-12-03 00:20:16.197 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:16 compute-1 kernel: tapf7ff943d-e0: left promiscuous mode
Dec 03 00:20:16 compute-1 nova_compute[187157]: 2025-12-03 00:20:16.211 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:16.216 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ee4518-e3ac-4d75-bf7f-e73b79d5cd5b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:16.234 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[66479a73-1724-42aa-af48-ed968f3514dd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:16.235 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e56a2a4d-9942-42f3-902e-00f28bee5d9a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:16.248 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[83f9b815-df00-4cee-bf8c-ac6040e91c26]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517980, 'reachable_time': 40763, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219571, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:16 compute-1 systemd[1]: run-netns-ovnmeta\x2df7ff943d\x2de57d\x2d4bc2\x2d8dd6\x2df8a8bb6e4f89.mount: Deactivated successfully.
Dec 03 00:20:16 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 00:20:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:16.251 104464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:20:16 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:16.251 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[9a793145-225f-4512-a7bb-4349bec6807d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:16 compute-1 nova_compute[187157]: 2025-12-03 00:20:16.500 187161 DEBUG nova.virt.libvirt.vif [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-03T00:17:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1409069009',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1409069',id=26,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:18:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-82faa1lu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',im
age_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:19:29Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=c9a442a2-b67f-45a9-a7b3-2f866d137327,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:20:16 compute-1 nova_compute[187157]: 2025-12-03 00:20:16.501 187161 DEBUG nova.network.os_vif_util [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converting VIF {"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:20:16 compute-1 nova_compute[187157]: 2025-12-03 00:20:16.502 187161 DEBUG nova.network.os_vif_util [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:91:4e:2a,bridge_name='br-int',has_traffic_filtering=True,id=c926feac-0f5a-4138-a74f-f066c3bf5f80,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc926feac-0f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:20:16 compute-1 nova_compute[187157]: 2025-12-03 00:20:16.502 187161 DEBUG os_vif [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:91:4e:2a,bridge_name='br-int',has_traffic_filtering=True,id=c926feac-0f5a-4138-a74f-f066c3bf5f80,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc926feac-0f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:20:16 compute-1 nova_compute[187157]: 2025-12-03 00:20:16.503 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:16 compute-1 nova_compute[187157]: 2025-12-03 00:20:16.504 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc926feac-0f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:16 compute-1 nova_compute[187157]: 2025-12-03 00:20:16.507 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:20:16 compute-1 nova_compute[187157]: 2025-12-03 00:20:16.508 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:16 compute-1 nova_compute[187157]: 2025-12-03 00:20:16.508 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=f9e8be79-1d8c-4515-b700-4e70f245156d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:16 compute-1 nova_compute[187157]: 2025-12-03 00:20:16.510 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:16 compute-1 nova_compute[187157]: 2025-12-03 00:20:16.512 187161 INFO os_vif [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:91:4e:2a,bridge_name='br-int',has_traffic_filtering=True,id=c926feac-0f5a-4138-a74f-f066c3bf5f80,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc926feac-0f')
Dec 03 00:20:16 compute-1 nova_compute[187157]: 2025-12-03 00:20:16.512 187161 INFO nova.virt.libvirt.driver [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Deleting instance files /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327_del
Dec 03 00:20:16 compute-1 nova_compute[187157]: 2025-12-03 00:20:16.513 187161 INFO nova.virt.libvirt.driver [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Deletion of /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327_del complete
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.023 187161 DEBUG nova.compute.manager [req-b10a3750-0b25-4de8-8b0f-04badfa6fbfa req-47686821-e6e7-4f55-ae22-83108975c4f9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.024 187161 DEBUG oslo_concurrency.lockutils [req-b10a3750-0b25-4de8-8b0f-04badfa6fbfa req-47686821-e6e7-4f55-ae22-83108975c4f9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.024 187161 DEBUG oslo_concurrency.lockutils [req-b10a3750-0b25-4de8-8b0f-04badfa6fbfa req-47686821-e6e7-4f55-ae22-83108975c4f9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.024 187161 DEBUG oslo_concurrency.lockutils [req-b10a3750-0b25-4de8-8b0f-04badfa6fbfa req-47686821-e6e7-4f55-ae22-83108975c4f9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.024 187161 DEBUG nova.compute.manager [req-b10a3750-0b25-4de8-8b0f-04badfa6fbfa req-47686821-e6e7-4f55-ae22-83108975c4f9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] No waiting events found dispatching network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.024 187161 DEBUG nova.compute.manager [req-b10a3750-0b25-4de8-8b0f-04badfa6fbfa req-47686821-e6e7-4f55-ae22-83108975c4f9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.026 187161 INFO nova.compute.manager [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Took 1.32 seconds to destroy the instance on the hypervisor.
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.027 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.027 187161 DEBUG nova.compute.manager [-] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.027 187161 DEBUG nova.network.neutron [-] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.027 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.124 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.418 187161 DEBUG nova.compute.manager [req-b9a46571-1851-44f1-9fb2-dcc91a03608a req-bd20faa6-c79b-4236-9d26-fcebd56f2538 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-deleted-c926feac-0f5a-4138-a74f-f066c3bf5f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.418 187161 INFO nova.compute.manager [req-b9a46571-1851-44f1-9fb2-dcc91a03608a req-bd20faa6-c79b-4236-9d26-fcebd56f2538 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Neutron deleted interface c926feac-0f5a-4138-a74f-f066c3bf5f80; detaching it from the instance and deleting it from the info cache
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.418 187161 DEBUG nova.network.neutron [req-b9a46571-1851-44f1-9fb2-dcc91a03608a req-bd20faa6-c79b-4236-9d26-fcebd56f2538 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.875 187161 DEBUG nova.network.neutron [-] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:20:17 compute-1 nova_compute[187157]: 2025-12-03 00:20:17.924 187161 DEBUG nova.compute.manager [req-b9a46571-1851-44f1-9fb2-dcc91a03608a req-bd20faa6-c79b-4236-9d26-fcebd56f2538 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Detach interface failed, port_id=c926feac-0f5a-4138-a74f-f066c3bf5f80, reason: Instance c9a442a2-b67f-45a9-a7b3-2f866d137327 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:20:18 compute-1 nova_compute[187157]: 2025-12-03 00:20:18.383 187161 INFO nova.compute.manager [-] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Took 1.36 seconds to deallocate network for instance.
Dec 03 00:20:18 compute-1 nova_compute[187157]: 2025-12-03 00:20:18.974 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:19 compute-1 nova_compute[187157]: 2025-12-03 00:20:19.078 187161 DEBUG nova.compute.manager [req-e309d727-b9df-4727-b504-5b201ac43731 req-9cfc36fa-8a24-4287-b9e3-da2dc57f4907 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:20:19 compute-1 nova_compute[187157]: 2025-12-03 00:20:19.079 187161 DEBUG oslo_concurrency.lockutils [req-e309d727-b9df-4727-b504-5b201ac43731 req-9cfc36fa-8a24-4287-b9e3-da2dc57f4907 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:19 compute-1 nova_compute[187157]: 2025-12-03 00:20:19.079 187161 DEBUG oslo_concurrency.lockutils [req-e309d727-b9df-4727-b504-5b201ac43731 req-9cfc36fa-8a24-4287-b9e3-da2dc57f4907 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:19 compute-1 nova_compute[187157]: 2025-12-03 00:20:19.079 187161 DEBUG oslo_concurrency.lockutils [req-e309d727-b9df-4727-b504-5b201ac43731 req-9cfc36fa-8a24-4287-b9e3-da2dc57f4907 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:19 compute-1 nova_compute[187157]: 2025-12-03 00:20:19.079 187161 DEBUG nova.compute.manager [req-e309d727-b9df-4727-b504-5b201ac43731 req-9cfc36fa-8a24-4287-b9e3-da2dc57f4907 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] No waiting events found dispatching network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:20:19 compute-1 nova_compute[187157]: 2025-12-03 00:20:19.079 187161 WARNING nova.compute.manager [req-e309d727-b9df-4727-b504-5b201ac43731 req-9cfc36fa-8a24-4287-b9e3-da2dc57f4907 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received unexpected event network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 for instance with vm_state deleted and task_state None.
Dec 03 00:20:19 compute-1 nova_compute[187157]: 2025-12-03 00:20:19.286 187161 DEBUG oslo_concurrency.lockutils [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:19 compute-1 nova_compute[187157]: 2025-12-03 00:20:19.287 187161 DEBUG oslo_concurrency.lockutils [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:19 compute-1 nova_compute[187157]: 2025-12-03 00:20:19.348 187161 DEBUG nova.compute.provider_tree [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:20:19 compute-1 openstack_network_exporter[199685]: ERROR   00:20:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:20:19 compute-1 openstack_network_exporter[199685]: ERROR   00:20:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:20:19 compute-1 openstack_network_exporter[199685]: ERROR   00:20:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:20:19 compute-1 openstack_network_exporter[199685]: ERROR   00:20:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:20:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:20:19 compute-1 openstack_network_exporter[199685]: ERROR   00:20:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:20:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:20:19 compute-1 nova_compute[187157]: 2025-12-03 00:20:19.856 187161 DEBUG nova.scheduler.client.report [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:20:20 compute-1 nova_compute[187157]: 2025-12-03 00:20:20.369 187161 DEBUG oslo_concurrency.lockutils [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.083s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:21 compute-1 nova_compute[187157]: 2025-12-03 00:20:21.509 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:21 compute-1 nova_compute[187157]: 2025-12-03 00:20:21.511 187161 INFO nova.scheduler.client.report [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Deleted allocations for instance c9a442a2-b67f-45a9-a7b3-2f866d137327
Dec 03 00:20:22 compute-1 nova_compute[187157]: 2025-12-03 00:20:22.544 187161 DEBUG oslo_concurrency.lockutils [None req-d6b6b03b-b2cf-4842-96f9-7c1d8ef983e8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.371s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:23 compute-1 nova_compute[187157]: 2025-12-03 00:20:23.977 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:25 compute-1 podman[219573]: 2025-12-03 00:20:25.212404277 +0000 UTC m=+0.057052628 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Dec 03 00:20:26 compute-1 nova_compute[187157]: 2025-12-03 00:20:26.511 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:28 compute-1 podman[219594]: 2025-12-03 00:20:28.251369742 +0000 UTC m=+0.082929016 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:20:28 compute-1 nova_compute[187157]: 2025-12-03 00:20:28.979 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:31 compute-1 nova_compute[187157]: 2025-12-03 00:20:31.512 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:34 compute-1 nova_compute[187157]: 2025-12-03 00:20:34.024 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:35 compute-1 podman[197537]: time="2025-12-03T00:20:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:20:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:20:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:20:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:20:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2611 "" "Go-http-client/1.1"
Dec 03 00:20:36 compute-1 nova_compute[187157]: 2025-12-03 00:20:36.515 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:38 compute-1 podman[219615]: 2025-12-03 00:20:38.214696368 +0000 UTC m=+0.054524956 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:20:39 compute-1 nova_compute[187157]: 2025-12-03 00:20:39.027 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:40 compute-1 podman[219641]: 2025-12-03 00:20:40.21936158 +0000 UTC m=+0.057991070 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 03 00:20:40 compute-1 podman[219642]: 2025-12-03 00:20:40.260350266 +0000 UTC m=+0.094960669 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 03 00:20:41 compute-1 nova_compute[187157]: 2025-12-03 00:20:41.519 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:44 compute-1 nova_compute[187157]: 2025-12-03 00:20:44.059 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:44 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:44.809 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:20:44 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:44.810 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:20:44 compute-1 nova_compute[187157]: 2025-12-03 00:20:44.810 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:46 compute-1 nova_compute[187157]: 2025-12-03 00:20:46.521 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:49 compute-1 nova_compute[187157]: 2025-12-03 00:20:49.094 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:49 compute-1 openstack_network_exporter[199685]: ERROR   00:20:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:20:49 compute-1 openstack_network_exporter[199685]: ERROR   00:20:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:20:49 compute-1 openstack_network_exporter[199685]: ERROR   00:20:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:20:49 compute-1 openstack_network_exporter[199685]: ERROR   00:20:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:20:49 compute-1 openstack_network_exporter[199685]: ERROR   00:20:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:20:49 compute-1 nova_compute[187157]: 2025-12-03 00:20:49.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:20:50 compute-1 nova_compute[187157]: 2025-12-03 00:20:50.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:20:51 compute-1 nova_compute[187157]: 2025-12-03 00:20:51.523 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:51 compute-1 nova_compute[187157]: 2025-12-03 00:20:51.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:20:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:20:51.811 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:53 compute-1 nova_compute[187157]: 2025-12-03 00:20:53.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:20:54 compute-1 nova_compute[187157]: 2025-12-03 00:20:54.142 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:54 compute-1 nova_compute[187157]: 2025-12-03 00:20:54.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:54 compute-1 nova_compute[187157]: 2025-12-03 00:20:54.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:54 compute-1 nova_compute[187157]: 2025-12-03 00:20:54.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:54 compute-1 nova_compute[187157]: 2025-12-03 00:20:54.217 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:20:54 compute-1 nova_compute[187157]: 2025-12-03 00:20:54.352 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:20:54 compute-1 nova_compute[187157]: 2025-12-03 00:20:54.353 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:20:54 compute-1 nova_compute[187157]: 2025-12-03 00:20:54.369 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:20:54 compute-1 nova_compute[187157]: 2025-12-03 00:20:54.370 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5820MB free_disk=73.16122055053711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:20:54 compute-1 nova_compute[187157]: 2025-12-03 00:20:54.370 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:54 compute-1 nova_compute[187157]: 2025-12-03 00:20:54.371 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:55 compute-1 nova_compute[187157]: 2025-12-03 00:20:55.412 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:20:55 compute-1 nova_compute[187157]: 2025-12-03 00:20:55.413 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:20:54 up  1:27,  0 user,  load average: 0.09, 0.15, 0.25\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:20:55 compute-1 nova_compute[187157]: 2025-12-03 00:20:55.442 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:20:55 compute-1 nova_compute[187157]: 2025-12-03 00:20:55.951 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:20:56 compute-1 podman[219690]: 2025-12-03 00:20:56.22213223 +0000 UTC m=+0.058252816 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Dec 03 00:20:56 compute-1 nova_compute[187157]: 2025-12-03 00:20:56.463 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:20:56 compute-1 nova_compute[187157]: 2025-12-03 00:20:56.463 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:56 compute-1 nova_compute[187157]: 2025-12-03 00:20:56.573 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:59 compute-1 nova_compute[187157]: 2025-12-03 00:20:59.144 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:59 compute-1 podman[219712]: 2025-12-03 00:20:59.221319729 +0000 UTC m=+0.048879799 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd)
Dec 03 00:21:01 compute-1 nova_compute[187157]: 2025-12-03 00:21:01.459 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:01 compute-1 nova_compute[187157]: 2025-12-03 00:21:01.575 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:01 compute-1 nova_compute[187157]: 2025-12-03 00:21:01.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:01.748 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:01.748 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:01.748 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:02 compute-1 nova_compute[187157]: 2025-12-03 00:21:02.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:02 compute-1 nova_compute[187157]: 2025-12-03 00:21:02.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:21:04 compute-1 nova_compute[187157]: 2025-12-03 00:21:04.146 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:05 compute-1 podman[197537]: time="2025-12-03T00:21:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:21:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:21:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:21:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:21:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2617 "" "Go-http-client/1.1"
Dec 03 00:21:06 compute-1 nova_compute[187157]: 2025-12-03 00:21:06.576 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:06 compute-1 nova_compute[187157]: 2025-12-03 00:21:06.695 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:07 compute-1 nova_compute[187157]: 2025-12-03 00:21:07.204 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:09 compute-1 nova_compute[187157]: 2025-12-03 00:21:09.148 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:09 compute-1 podman[219733]: 2025-12-03 00:21:09.19436194 +0000 UTC m=+0.040948737 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:21:11 compute-1 podman[219757]: 2025-12-03 00:21:11.228847898 +0000 UTC m=+0.070033053 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:21:11 compute-1 podman[219758]: 2025-12-03 00:21:11.238380449 +0000 UTC m=+0.078577430 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:21:11 compute-1 nova_compute[187157]: 2025-12-03 00:21:11.577 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:14 compute-1 nova_compute[187157]: 2025-12-03 00:21:14.179 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:14 compute-1 sshd-session[219799]: Received disconnect from 193.46.255.99 port 57756:11:  [preauth]
Dec 03 00:21:14 compute-1 sshd-session[219799]: Disconnected from authenticating user root 193.46.255.99 port 57756 [preauth]
Dec 03 00:21:16 compute-1 nova_compute[187157]: 2025-12-03 00:21:16.580 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:19 compute-1 nova_compute[187157]: 2025-12-03 00:21:19.295 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:19 compute-1 openstack_network_exporter[199685]: ERROR   00:21:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:21:19 compute-1 openstack_network_exporter[199685]: ERROR   00:21:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:21:19 compute-1 openstack_network_exporter[199685]: ERROR   00:21:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:21:19 compute-1 openstack_network_exporter[199685]: ERROR   00:21:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:21:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:21:19 compute-1 openstack_network_exporter[199685]: ERROR   00:21:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:21:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:21:19 compute-1 sshd-session[219801]: Invalid user sol from 193.32.162.146 port 59830
Dec 03 00:21:19 compute-1 sshd-session[219801]: Connection closed by invalid user sol 193.32.162.146 port 59830 [preauth]
Dec 03 00:21:21 compute-1 nova_compute[187157]: 2025-12-03 00:21:21.582 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:22 compute-1 sshd-session[219803]: Invalid user ubuntu from 45.148.10.240 port 54592
Dec 03 00:21:22 compute-1 sshd-session[219803]: Connection closed by invalid user ubuntu 45.148.10.240 port 54592 [preauth]
Dec 03 00:21:24 compute-1 nova_compute[187157]: 2025-12-03 00:21:24.296 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:26 compute-1 nova_compute[187157]: 2025-12-03 00:21:26.583 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:27 compute-1 podman[219805]: 2025-12-03 00:21:27.214192046 +0000 UTC m=+0.057084568 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Dec 03 00:21:29 compute-1 nova_compute[187157]: 2025-12-03 00:21:29.297 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:30 compute-1 podman[219827]: 2025-12-03 00:21:30.215116747 +0000 UTC m=+0.054476474 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd)
Dec 03 00:21:31 compute-1 nova_compute[187157]: 2025-12-03 00:21:31.585 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:32 compute-1 nova_compute[187157]: 2025-12-03 00:21:32.803 187161 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Creating tmpfile /var/lib/nova/instances/tmpzntdw8ou to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 03 00:21:32 compute-1 nova_compute[187157]: 2025-12-03 00:21:32.805 187161 WARNING neutronclient.v2_0.client [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:32 compute-1 nova_compute[187157]: 2025-12-03 00:21:32.874 187161 DEBUG nova.compute.manager [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzntdw8ou',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 03 00:21:32 compute-1 nova_compute[187157]: 2025-12-03 00:21:32.923 187161 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Creating tmpfile /var/lib/nova/instances/tmp7an9wy_h to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 03 00:21:32 compute-1 nova_compute[187157]: 2025-12-03 00:21:32.924 187161 WARNING neutronclient.v2_0.client [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:32 compute-1 nova_compute[187157]: 2025-12-03 00:21:32.926 187161 DEBUG nova.compute.manager [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7an9wy_h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 03 00:21:33 compute-1 ovn_controller[95464]: 2025-12-03T00:21:33Z|00249|memory_trim|INFO|Detected inactivity (last active 30015 ms ago): trimming memory
Dec 03 00:21:34 compute-1 nova_compute[187157]: 2025-12-03 00:21:34.300 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:34 compute-1 nova_compute[187157]: 2025-12-03 00:21:34.914 187161 WARNING neutronclient.v2_0.client [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:34 compute-1 nova_compute[187157]: 2025-12-03 00:21:34.982 187161 WARNING neutronclient.v2_0.client [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:35 compute-1 podman[197537]: time="2025-12-03T00:21:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:21:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:21:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:21:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:21:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2619 "" "Go-http-client/1.1"
Dec 03 00:21:36 compute-1 nova_compute[187157]: 2025-12-03 00:21:36.588 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:38 compute-1 nova_compute[187157]: 2025-12-03 00:21:38.819 187161 DEBUG nova.compute.manager [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzntdw8ou',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7643810a-7499-484f-80e2-2a0a33cafc55',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 03 00:21:39 compute-1 nova_compute[187157]: 2025-12-03 00:21:39.302 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:40 compute-1 podman[219848]: 2025-12-03 00:21:40.203296954 +0000 UTC m=+0.050108579 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:21:40 compute-1 nova_compute[187157]: 2025-12-03 00:21:40.230 187161 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-7643810a-7499-484f-80e2-2a0a33cafc55" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:21:40 compute-1 nova_compute[187157]: 2025-12-03 00:21:40.231 187161 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-7643810a-7499-484f-80e2-2a0a33cafc55" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:21:40 compute-1 nova_compute[187157]: 2025-12-03 00:21:40.231 187161 DEBUG nova.network.neutron [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:21:40 compute-1 nova_compute[187157]: 2025-12-03 00:21:40.739 187161 WARNING neutronclient.v2_0.client [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:41 compute-1 nova_compute[187157]: 2025-12-03 00:21:41.352 187161 WARNING neutronclient.v2_0.client [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:41 compute-1 nova_compute[187157]: 2025-12-03 00:21:41.481 187161 DEBUG nova.network.neutron [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Updating instance_info_cache with network_info: [{"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:21:41 compute-1 nova_compute[187157]: 2025-12-03 00:21:41.602 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:42 compute-1 podman[219873]: 2025-12-03 00:21:42.196356135 +0000 UTC m=+0.043934689 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 03 00:21:42 compute-1 podman[219874]: 2025-12-03 00:21:42.229265894 +0000 UTC m=+0.074416969 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.250 187161 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-7643810a-7499-484f-80e2-2a0a33cafc55" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.264 187161 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzntdw8ou',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7643810a-7499-484f-80e2-2a0a33cafc55',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.264 187161 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Creating instance directory: /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.264 187161 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Creating disk.info with the contents: {'/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk': 'qcow2', '/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.265 187161 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.265 187161 DEBUG nova.objects.instance [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7643810a-7499-484f-80e2-2a0a33cafc55 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.771 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.776 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.777 187161 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.839 187161 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.840 187161 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.840 187161 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.841 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.844 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.845 187161 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.896 187161 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.897 187161 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.926 187161 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.927 187161 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.086s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.927 187161 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.974 187161 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.975 187161 DEBUG nova.virt.disk.api [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:21:42 compute-1 nova_compute[187157]: 2025-12-03 00:21:42.975 187161 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.024 187161 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.024 187161 DEBUG nova.virt.disk.api [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.025 187161 DEBUG nova.objects.instance [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 7643810a-7499-484f-80e2-2a0a33cafc55 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.531 187161 DEBUG nova.objects.base [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<7643810a-7499-484f-80e2-2a0a33cafc55> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.532 187161 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.557 187161 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk.config 497664" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.558 187161 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.559 187161 DEBUG nova.virt.libvirt.vif [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:20:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1137830821',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1137830',id=28,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:20:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-0dp6pegi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',ima
ge_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:20:46Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=7643810a-7499-484f-80e2-2a0a33cafc55,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.559 187161 DEBUG nova.network.os_vif_util [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.561 187161 DEBUG nova.network.os_vif_util [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:ff:2a,bridge_name='br-int',has_traffic_filtering=True,id=96aba6d6-d4d8-494d-9070-4ad5c1609fdf,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96aba6d6-d4') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.561 187161 DEBUG os_vif [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:ff:2a,bridge_name='br-int',has_traffic_filtering=True,id=96aba6d6-d4d8-494d-9070-4ad5c1609fdf,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96aba6d6-d4') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.562 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.562 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.562 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.563 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.563 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c47d244b-3b23-5c14-b03c-8369e8c6b298', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.564 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.566 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.568 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.568 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap96aba6d6-d4, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.568 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap96aba6d6-d4, col_values=(('qos', UUID('a4ed2d98-b04c-4b34-aabe-a2b6daea767c')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.569 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap96aba6d6-d4, col_values=(('external_ids', {'iface-id': '96aba6d6-d4d8-494d-9070-4ad5c1609fdf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:ff:2a', 'vm-uuid': '7643810a-7499-484f-80e2-2a0a33cafc55'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.570 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:43 compute-1 NetworkManager[55553]: <info>  [1764721303.5714] manager: (tap96aba6d6-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.572 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.576 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.577 187161 INFO os_vif [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:ff:2a,bridge_name='br-int',has_traffic_filtering=True,id=96aba6d6-d4d8-494d-9070-4ad5c1609fdf,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96aba6d6-d4')
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.577 187161 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.578 187161 DEBUG nova.compute.manager [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzntdw8ou',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7643810a-7499-484f-80e2-2a0a33cafc55',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.578 187161 WARNING neutronclient.v2_0.client [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:43 compute-1 nova_compute[187157]: 2025-12-03 00:21:43.972 187161 WARNING neutronclient.v2_0.client [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:44 compute-1 nova_compute[187157]: 2025-12-03 00:21:44.344 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:46 compute-1 nova_compute[187157]: 2025-12-03 00:21:46.755 187161 DEBUG nova.network.neutron [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Port 96aba6d6-d4d8-494d-9070-4ad5c1609fdf updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 03 00:21:46 compute-1 nova_compute[187157]: 2025-12-03 00:21:46.766 187161 DEBUG nova.compute.manager [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzntdw8ou',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7643810a-7499-484f-80e2-2a0a33cafc55',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 03 00:21:48 compute-1 nova_compute[187157]: 2025-12-03 00:21:48.572 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:49 compute-1 nova_compute[187157]: 2025-12-03 00:21:49.402 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:49 compute-1 openstack_network_exporter[199685]: ERROR   00:21:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:21:49 compute-1 openstack_network_exporter[199685]: ERROR   00:21:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:21:49 compute-1 openstack_network_exporter[199685]: ERROR   00:21:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:21:49 compute-1 openstack_network_exporter[199685]: ERROR   00:21:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:21:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:21:49 compute-1 openstack_network_exporter[199685]: ERROR   00:21:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:21:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:21:50 compute-1 kernel: tap96aba6d6-d4: entered promiscuous mode
Dec 03 00:21:50 compute-1 NetworkManager[55553]: <info>  [1764721310.7389] manager: (tap96aba6d6-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/89)
Dec 03 00:21:50 compute-1 nova_compute[187157]: 2025-12-03 00:21:50.739 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:50 compute-1 ovn_controller[95464]: 2025-12-03T00:21:50Z|00250|binding|INFO|Claiming lport 96aba6d6-d4d8-494d-9070-4ad5c1609fdf for this additional chassis.
Dec 03 00:21:50 compute-1 ovn_controller[95464]: 2025-12-03T00:21:50Z|00251|binding|INFO|96aba6d6-d4d8-494d-9070-4ad5c1609fdf: Claiming fa:16:3e:9a:ff:2a 10.100.0.8
Dec 03 00:21:50 compute-1 ovn_controller[95464]: 2025-12-03T00:21:50Z|00252|binding|INFO|Setting lport 96aba6d6-d4d8-494d-9070-4ad5c1609fdf ovn-installed in OVS
Dec 03 00:21:50 compute-1 nova_compute[187157]: 2025-12-03 00:21:50.752 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:50 compute-1 nova_compute[187157]: 2025-12-03 00:21:50.754 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:50 compute-1 systemd-udevd[219949]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:21:50 compute-1 NetworkManager[55553]: <info>  [1764721310.7786] device (tap96aba6d6-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:21:50 compute-1 NetworkManager[55553]: <info>  [1764721310.7798] device (tap96aba6d6-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:21:50 compute-1 systemd-machined[153454]: New machine qemu-23-instance-0000001c.
Dec 03 00:21:50 compute-1 systemd[1]: Started Virtual Machine qemu-23-instance-0000001c.
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.103 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:ff:2a 10.100.0.8'], port_security=['fa:16:3e:9a:ff:2a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7643810a-7499-484f-80e2-2a0a33cafc55', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'aa7bb0b8-346d-4df1-ade9-c8e68672df4d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=96aba6d6-d4d8-494d-9070-4ad5c1609fdf) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.104 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 96aba6d6-d4d8-494d-9070-4ad5c1609fdf in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 unbound from our chassis
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.105 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.117 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b3c968ac-33e2-417b-8102-66f83fb65f32]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.117 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf7ff943d-e1 in ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.119 207957 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf7ff943d-e0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.119 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[db5408d6-bcab-42da-a5eb-118cc80da946]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.120 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[25e1b310-7945-4681-ba5c-293d55d6c448]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.132 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[69e5ff83-d125-437f-b7b7-9e340151cb56]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.148 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[148a4c83-e6f8-47a5-8359-a66d3f1b8b24]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.178 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[ebfbcba7-97c6-4d62-8d8f-7c5e4fc040da]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.182 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[cf445289-bcd8-490f-910b-bd74de3cd8a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 NetworkManager[55553]: <info>  [1764721311.1841] manager: (tapf7ff943d-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/90)
Dec 03 00:21:51 compute-1 systemd-udevd[219953]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.215 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[2f7ac6d9-8167-4301-ac85-5dfe48a2371b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.217 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[5892c2aa-2acb-4c20-b779-e0c15b6f21f3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 NetworkManager[55553]: <info>  [1764721311.2399] device (tapf7ff943d-e0): carrier: link connected
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.248 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[0493a21b-3e79-421f-b6ed-9dbdd20c3505]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.263 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c88b1a-ee5a-466d-b78c-b7a8f8a93b26]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7ff943d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:96:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533089, 'reachable_time': 19916, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219984, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.278 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[67febd79-6631-42c8-a41d-0920a2c2ffc0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe70:9625'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 533089, 'tstamp': 533089}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219987, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.292 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[bfe3d844-3380-473c-8220-5bc092aeac32]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7ff943d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:96:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533089, 'reachable_time': 19916, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219992, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.326 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[4b456152-9b90-43b9-8284-417661f010ae]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.380 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[efe300f7-74a4-4aab-8af6-da2f80339ac2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.382 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ff943d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.382 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.382 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7ff943d-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:51 compute-1 nova_compute[187157]: 2025-12-03 00:21:51.384 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:51 compute-1 kernel: tapf7ff943d-e0: entered promiscuous mode
Dec 03 00:21:51 compute-1 NetworkManager[55553]: <info>  [1764721311.3847] manager: (tapf7ff943d-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Dec 03 00:21:51 compute-1 nova_compute[187157]: 2025-12-03 00:21:51.385 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.386 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7ff943d-e0, col_values=(('external_ids', {'iface-id': '636cd919-869d-4a8a-92fa-ec7c18804da5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:51 compute-1 nova_compute[187157]: 2025-12-03 00:21:51.387 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:51 compute-1 ovn_controller[95464]: 2025-12-03T00:21:51Z|00253|binding|INFO|Releasing lport 636cd919-869d-4a8a-92fa-ec7c18804da5 from this chassis (sb_readonly=0)
Dec 03 00:21:51 compute-1 nova_compute[187157]: 2025-12-03 00:21:51.388 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.389 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[c44fc73a-210d-4a8f-b431-da021a96bf9e]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.389 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.389 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.390 104348 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.390 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.390 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[c2c93609-cb0c-48a7-9349-c23bcb0063ad]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.391 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.391 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c1c380-169f-4a72-a08c-a892d1bf8413]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.391 104348 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: global
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     log         /dev/log local0 debug
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     log-tag     haproxy-metadata-proxy-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     user        root
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     group       root
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     maxconn     1024
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     pidfile     /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     daemon
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: defaults
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     log global
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     mode http
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     option httplog
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     option dontlognull
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     option http-server-close
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     option forwardfor
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     retries                 3
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     timeout http-request    30s
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     timeout connect         30s
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     timeout client          32s
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     timeout server          32s
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     timeout http-keep-alive 30s
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: listen listener
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     bind 169.254.169.254:80
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:     http-request add-header X-OVN-Network-ID f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:21:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:51.392 104348 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'env', 'PROCESS_TAG=haproxy-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:21:51 compute-1 nova_compute[187157]: 2025-12-03 00:21:51.398 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:51 compute-1 nova_compute[187157]: 2025-12-03 00:21:51.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:51 compute-1 nova_compute[187157]: 2025-12-03 00:21:51.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:51 compute-1 nova_compute[187157]: 2025-12-03 00:21:51.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:51 compute-1 podman[220026]: 2025-12-03 00:21:51.716797947 +0000 UTC m=+0.025577253 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:21:51 compute-1 podman[220026]: 2025-12-03 00:21:51.96096236 +0000 UTC m=+0.269741646 container create 63f8f13c363b919068d7005d209cbf03733b7966c46d8a2561731e81f6fcbda7 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:21:52 compute-1 systemd[1]: Started libpod-conmon-63f8f13c363b919068d7005d209cbf03733b7966c46d8a2561731e81f6fcbda7.scope.
Dec 03 00:21:52 compute-1 systemd[1]: Started libcrun container.
Dec 03 00:21:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83b2803fd9806c65b9a403c30d9cf57712ed3a38e6ae8d1833df1c5d6824f716/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:21:52 compute-1 podman[220026]: 2025-12-03 00:21:52.174929229 +0000 UTC m=+0.483708515 container init 63f8f13c363b919068d7005d209cbf03733b7966c46d8a2561731e81f6fcbda7 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 03 00:21:52 compute-1 podman[220026]: 2025-12-03 00:21:52.186207844 +0000 UTC m=+0.494987130 container start 63f8f13c363b919068d7005d209cbf03733b7966c46d8a2561731e81f6fcbda7 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 03 00:21:52 compute-1 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[220054]: [NOTICE]   (220058) : New worker (220060) forked
Dec 03 00:21:52 compute-1 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[220054]: [NOTICE]   (220058) : Loading success.
Dec 03 00:21:53 compute-1 nova_compute[187157]: 2025-12-03 00:21:53.240 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:53.240 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:21:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:53.242 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:21:53 compute-1 ovn_controller[95464]: 2025-12-03T00:21:53Z|00254|binding|INFO|Claiming lport 96aba6d6-d4d8-494d-9070-4ad5c1609fdf for this chassis.
Dec 03 00:21:53 compute-1 ovn_controller[95464]: 2025-12-03T00:21:53Z|00255|binding|INFO|96aba6d6-d4d8-494d-9070-4ad5c1609fdf: Claiming fa:16:3e:9a:ff:2a 10.100.0.8
Dec 03 00:21:53 compute-1 ovn_controller[95464]: 2025-12-03T00:21:53Z|00256|binding|INFO|Setting lport 96aba6d6-d4d8-494d-9070-4ad5c1609fdf up in Southbound
Dec 03 00:21:53 compute-1 nova_compute[187157]: 2025-12-03 00:21:53.574 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:54 compute-1 nova_compute[187157]: 2025-12-03 00:21:54.464 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:55 compute-1 nova_compute[187157]: 2025-12-03 00:21:55.007 187161 INFO nova.compute.manager [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Post operation of migration started
Dec 03 00:21:55 compute-1 nova_compute[187157]: 2025-12-03 00:21:55.008 187161 WARNING neutronclient.v2_0.client [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:55 compute-1 nova_compute[187157]: 2025-12-03 00:21:55.165 187161 WARNING neutronclient.v2_0.client [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:55 compute-1 nova_compute[187157]: 2025-12-03 00:21:55.166 187161 WARNING neutronclient.v2_0.client [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:21:55.243 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:55 compute-1 nova_compute[187157]: 2025-12-03 00:21:55.305 187161 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-7643810a-7499-484f-80e2-2a0a33cafc55" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:21:55 compute-1 nova_compute[187157]: 2025-12-03 00:21:55.306 187161 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-7643810a-7499-484f-80e2-2a0a33cafc55" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:21:55 compute-1 nova_compute[187157]: 2025-12-03 00:21:55.306 187161 DEBUG nova.network.neutron [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:21:55 compute-1 nova_compute[187157]: 2025-12-03 00:21:55.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:56 compute-1 nova_compute[187157]: 2025-12-03 00:21:56.661 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:56 compute-1 nova_compute[187157]: 2025-12-03 00:21:56.661 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:56 compute-1 nova_compute[187157]: 2025-12-03 00:21:56.662 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:56 compute-1 nova_compute[187157]: 2025-12-03 00:21:56.662 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:21:56 compute-1 nova_compute[187157]: 2025-12-03 00:21:56.663 187161 WARNING neutronclient.v2_0.client [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:57 compute-1 nova_compute[187157]: 2025-12-03 00:21:57.835 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:21:57 compute-1 nova_compute[187157]: 2025-12-03 00:21:57.900 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:21:57 compute-1 nova_compute[187157]: 2025-12-03 00:21:57.902 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:21:57 compute-1 nova_compute[187157]: 2025-12-03 00:21:57.933 187161 WARNING neutronclient.v2_0.client [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:57 compute-1 nova_compute[187157]: 2025-12-03 00:21:57.956 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:21:58 compute-1 nova_compute[187157]: 2025-12-03 00:21:58.095 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:21:58 compute-1 nova_compute[187157]: 2025-12-03 00:21:58.096 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:21:58 compute-1 nova_compute[187157]: 2025-12-03 00:21:58.110 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:21:58 compute-1 nova_compute[187157]: 2025-12-03 00:21:58.111 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5701MB free_disk=73.13199996948242GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:21:58 compute-1 nova_compute[187157]: 2025-12-03 00:21:58.111 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:58 compute-1 nova_compute[187157]: 2025-12-03 00:21:58.111 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:58 compute-1 podman[220078]: 2025-12-03 00:21:58.206971796 +0000 UTC m=+0.050762745 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc.)
Dec 03 00:21:58 compute-1 nova_compute[187157]: 2025-12-03 00:21:58.577 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:59 compute-1 nova_compute[187157]: 2025-12-03 00:21:59.040 187161 DEBUG nova.network.neutron [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Updating instance_info_cache with network_info: [{"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:21:59 compute-1 nova_compute[187157]: 2025-12-03 00:21:59.478 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Migration for instance 7643810a-7499-484f-80e2-2a0a33cafc55 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:21:59 compute-1 nova_compute[187157]: 2025-12-03 00:21:59.478 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Migration for instance bfaf8926-00b3-46a4-b85f-46ee074d049e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:21:59 compute-1 nova_compute[187157]: 2025-12-03 00:21:59.499 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:59 compute-1 nova_compute[187157]: 2025-12-03 00:21:59.546 187161 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-7643810a-7499-484f-80e2-2a0a33cafc55" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:22:00 compute-1 nova_compute[187157]: 2025-12-03 00:22:00.063 187161 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:01 compute-1 podman[220100]: 2025-12-03 00:22:01.228240421 +0000 UTC m=+0.057825196 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 03 00:22:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:01.749 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:01.749 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:01.749 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:01 compute-1 nova_compute[187157]: 2025-12-03 00:22:01.793 187161 INFO nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Updating resource usage from migration 232c77b8-b3ca-453e-acab-98823e5c2a0a
Dec 03 00:22:01 compute-1 nova_compute[187157]: 2025-12-03 00:22:01.793 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Starting to track incoming migration 232c77b8-b3ca-453e-acab-98823e5c2a0a with flavor b2669e62-ef04-4b34-b3d6-69efcfbafbdc _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 03 00:22:02 compute-1 nova_compute[187157]: 2025-12-03 00:22:02.302 187161 INFO nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Updating resource usage from migration 2676e5ea-14e0-4423-bac9-b4312d7935f8
Dec 03 00:22:02 compute-1 nova_compute[187157]: 2025-12-03 00:22:02.303 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Starting to track incoming migration 2676e5ea-14e0-4423-bac9-b4312d7935f8 with flavor b2669e62-ef04-4b34-b3d6-69efcfbafbdc _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 03 00:22:03 compute-1 nova_compute[187157]: 2025-12-03 00:22:03.344 187161 WARNING nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance bfaf8926-00b3-46a4-b85f-46ee074d049e has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Dec 03 00:22:03 compute-1 nova_compute[187157]: 2025-12-03 00:22:03.579 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:03 compute-1 nova_compute[187157]: 2025-12-03 00:22:03.850 187161 WARNING nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 7643810a-7499-484f-80e2-2a0a33cafc55 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Dec 03 00:22:03 compute-1 nova_compute[187157]: 2025-12-03 00:22:03.850 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:22:03 compute-1 nova_compute[187157]: 2025-12-03 00:22:03.851 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:21:58 up  1:28,  0 user,  load average: 0.23, 0.17, 0.25\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:22:03 compute-1 nova_compute[187157]: 2025-12-03 00:22:03.902 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:22:04 compute-1 nova_compute[187157]: 2025-12-03 00:22:04.418 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:22:04 compute-1 nova_compute[187157]: 2025-12-03 00:22:04.547 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:04 compute-1 nova_compute[187157]: 2025-12-03 00:22:04.931 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:22:04 compute-1 nova_compute[187157]: 2025-12-03 00:22:04.932 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 6.821s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:04 compute-1 nova_compute[187157]: 2025-12-03 00:22:04.932 187161 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 4.869s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:04 compute-1 nova_compute[187157]: 2025-12-03 00:22:04.932 187161 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:04 compute-1 nova_compute[187157]: 2025-12-03 00:22:04.936 187161 INFO nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 03 00:22:04 compute-1 virtqemud[186882]: Domain id=23 name='instance-0000001c' uuid=7643810a-7499-484f-80e2-2a0a33cafc55 is tainted: custom-monitor
Dec 03 00:22:05 compute-1 podman[197537]: time="2025-12-03T00:22:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:22:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:22:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:22:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:22:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3080 "" "Go-http-client/1.1"
Dec 03 00:22:05 compute-1 nova_compute[187157]: 2025-12-03 00:22:05.944 187161 INFO nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 03 00:22:06 compute-1 nova_compute[187157]: 2025-12-03 00:22:06.949 187161 INFO nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 03 00:22:06 compute-1 nova_compute[187157]: 2025-12-03 00:22:06.954 187161 DEBUG nova.compute.manager [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:22:07 compute-1 nova_compute[187157]: 2025-12-03 00:22:07.464 187161 DEBUG nova.objects.instance [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 03 00:22:07 compute-1 nova_compute[187157]: 2025-12-03 00:22:07.928 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:07 compute-1 nova_compute[187157]: 2025-12-03 00:22:07.929 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:07 compute-1 nova_compute[187157]: 2025-12-03 00:22:07.929 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:07 compute-1 nova_compute[187157]: 2025-12-03 00:22:07.929 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:07 compute-1 nova_compute[187157]: 2025-12-03 00:22:07.929 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:22:08 compute-1 nova_compute[187157]: 2025-12-03 00:22:08.482 187161 WARNING neutronclient.v2_0.client [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:08 compute-1 nova_compute[187157]: 2025-12-03 00:22:08.581 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:08 compute-1 nova_compute[187157]: 2025-12-03 00:22:08.841 187161 WARNING neutronclient.v2_0.client [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:08 compute-1 nova_compute[187157]: 2025-12-03 00:22:08.842 187161 WARNING neutronclient.v2_0.client [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:09 compute-1 nova_compute[187157]: 2025-12-03 00:22:09.548 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:11 compute-1 podman[220121]: 2025-12-03 00:22:11.20160549 +0000 UTC m=+0.048487580 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:22:13 compute-1 podman[220147]: 2025-12-03 00:22:13.204247632 +0000 UTC m=+0.046113021 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Dec 03 00:22:13 compute-1 podman[220148]: 2025-12-03 00:22:13.240271258 +0000 UTC m=+0.074700347 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, tcib_managed=true)
Dec 03 00:22:13 compute-1 nova_compute[187157]: 2025-12-03 00:22:13.583 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:14 compute-1 nova_compute[187157]: 2025-12-03 00:22:14.550 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:17 compute-1 nova_compute[187157]: 2025-12-03 00:22:17.356 187161 DEBUG nova.compute.manager [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7an9wy_h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bfaf8926-00b3-46a4-b85f-46ee074d049e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 03 00:22:17 compute-1 nova_compute[187157]: 2025-12-03 00:22:17.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:17 compute-1 nova_compute[187157]: 2025-12-03 00:22:17.700 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:17 compute-1 nova_compute[187157]: 2025-12-03 00:22:17.700 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:17 compute-1 nova_compute[187157]: 2025-12-03 00:22:17.701 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:17 compute-1 nova_compute[187157]: 2025-12-03 00:22:17.701 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:17 compute-1 nova_compute[187157]: 2025-12-03 00:22:17.701 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:17 compute-1 nova_compute[187157]: 2025-12-03 00:22:17.701 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.371 187161 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-bfaf8926-00b3-46a4-b85f-46ee074d049e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.372 187161 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-bfaf8926-00b3-46a4-b85f-46ee074d049e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.372 187161 DEBUG nova.network.neutron [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.635 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.718 187161 DEBUG nova.virt.libvirt.imagecache [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:314
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.719 187161 DEBUG nova.virt.libvirt.imagecache [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Image id 92e79321-71af-44a0-869c-1d5a9da5fefc yields fingerprint 4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:319
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.720 187161 INFO nova.virt.libvirt.imagecache [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] image 92e79321-71af-44a0-869c-1d5a9da5fefc at (/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0): checking
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.720 187161 DEBUG nova.virt.libvirt.imagecache [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] image 92e79321-71af-44a0-869c-1d5a9da5fefc at (/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0): image is in use _mark_in_use /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:279
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.724 187161 DEBUG nova.virt.libvirt.imagecache [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:319
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.724 187161 DEBUG nova.virt.libvirt.imagecache [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] 7643810a-7499-484f-80e2-2a0a33cafc55 is a valid instance name _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:126
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.724 187161 DEBUG nova.virt.libvirt.imagecache [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] 7643810a-7499-484f-80e2-2a0a33cafc55 has a disk file _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:129
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.725 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.773 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.774 187161 DEBUG nova.virt.libvirt.imagecache [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 7643810a-7499-484f-80e2-2a0a33cafc55 is backed by 4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:141
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.774 187161 INFO nova.virt.libvirt.imagecache [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Active base files: /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.775 187161 DEBUG nova.virt.libvirt.imagecache [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:350
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.775 187161 DEBUG nova.virt.libvirt.imagecache [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:299
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.775 187161 DEBUG nova.virt.libvirt.imagecache [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:284
Dec 03 00:22:18 compute-1 nova_compute[187157]: 2025-12-03 00:22:18.881 187161 WARNING neutronclient.v2_0.client [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:19 compute-1 openstack_network_exporter[199685]: ERROR   00:22:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:22:19 compute-1 openstack_network_exporter[199685]: ERROR   00:22:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:22:19 compute-1 openstack_network_exporter[199685]: ERROR   00:22:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:22:19 compute-1 openstack_network_exporter[199685]: ERROR   00:22:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:22:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:22:19 compute-1 openstack_network_exporter[199685]: ERROR   00:22:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:22:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:22:19 compute-1 nova_compute[187157]: 2025-12-03 00:22:19.424 187161 WARNING neutronclient.v2_0.client [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:19 compute-1 nova_compute[187157]: 2025-12-03 00:22:19.553 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:20 compute-1 nova_compute[187157]: 2025-12-03 00:22:20.116 187161 DEBUG nova.network.neutron [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Updating instance_info_cache with network_info: [{"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:22:20 compute-1 nova_compute[187157]: 2025-12-03 00:22:20.623 187161 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-bfaf8926-00b3-46a4-b85f-46ee074d049e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:22:21 compute-1 nova_compute[187157]: 2025-12-03 00:22:21.231 187161 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7an9wy_h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bfaf8926-00b3-46a4-b85f-46ee074d049e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 03 00:22:21 compute-1 nova_compute[187157]: 2025-12-03 00:22:21.232 187161 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Creating instance directory: /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 03 00:22:21 compute-1 nova_compute[187157]: 2025-12-03 00:22:21.233 187161 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Creating disk.info with the contents: {'/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk': 'qcow2', '/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 03 00:22:21 compute-1 nova_compute[187157]: 2025-12-03 00:22:21.233 187161 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 03 00:22:21 compute-1 nova_compute[187157]: 2025-12-03 00:22:21.234 187161 DEBUG nova.objects.instance [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'trusted_certs' on Instance uuid bfaf8926-00b3-46a4-b85f-46ee074d049e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:22:22 compute-1 nova_compute[187157]: 2025-12-03 00:22:22.268 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:22:22 compute-1 nova_compute[187157]: 2025-12-03 00:22:22.273 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:22:22 compute-1 nova_compute[187157]: 2025-12-03 00:22:22.274 187161 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:22:22 compute-1 nova_compute[187157]: 2025-12-03 00:22:22.358 187161 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:22:22 compute-1 nova_compute[187157]: 2025-12-03 00:22:22.359 187161 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:22 compute-1 nova_compute[187157]: 2025-12-03 00:22:22.360 187161 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:22 compute-1 nova_compute[187157]: 2025-12-03 00:22:22.360 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:22:22 compute-1 nova_compute[187157]: 2025-12-03 00:22:22.364 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:22:22 compute-1 nova_compute[187157]: 2025-12-03 00:22:22.365 187161 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:22:22 compute-1 nova_compute[187157]: 2025-12-03 00:22:22.423 187161 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:22:22 compute-1 nova_compute[187157]: 2025-12-03 00:22:22.424 187161 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:22:23 compute-1 ovn_controller[95464]: 2025-12-03T00:22:23Z|00257|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Dec 03 00:22:23 compute-1 nova_compute[187157]: 2025-12-03 00:22:23.639 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:23 compute-1 nova_compute[187157]: 2025-12-03 00:22:23.648 187161 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk 1073741824" returned: 0 in 1.224s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:22:23 compute-1 nova_compute[187157]: 2025-12-03 00:22:23.649 187161 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.289s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:23 compute-1 nova_compute[187157]: 2025-12-03 00:22:23.649 187161 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:22:23 compute-1 nova_compute[187157]: 2025-12-03 00:22:23.703 187161 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:22:23 compute-1 nova_compute[187157]: 2025-12-03 00:22:23.704 187161 DEBUG nova.virt.disk.api [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:22:23 compute-1 nova_compute[187157]: 2025-12-03 00:22:23.705 187161 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:22:23 compute-1 nova_compute[187157]: 2025-12-03 00:22:23.758 187161 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:22:23 compute-1 nova_compute[187157]: 2025-12-03 00:22:23.759 187161 DEBUG nova.virt.disk.api [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:22:23 compute-1 nova_compute[187157]: 2025-12-03 00:22:23.760 187161 DEBUG nova.objects.instance [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid bfaf8926-00b3-46a4-b85f-46ee074d049e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.269 187161 DEBUG nova.objects.base [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<bfaf8926-00b3-46a4-b85f-46ee074d049e> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.270 187161 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.293 187161 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk.config 497664" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.294 187161 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.296 187161 DEBUG nova.virt.libvirt.vif [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:20:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-350860971',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-3508609',id=29,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:21:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-7vu1e8y1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:21:05Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=bfaf8926-00b3-46a4-b85f-46ee074d049e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.296 187161 DEBUG nova.network.os_vif_util [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.297 187161 DEBUG nova.network.os_vif_util [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:5f:29,bridge_name='br-int',has_traffic_filtering=True,id=bd61e9e8-f7f0-458d-858f-ffb409383310,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd61e9e8-f7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.297 187161 DEBUG os_vif [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:5f:29,bridge_name='br-int',has_traffic_filtering=True,id=bd61e9e8-f7f0-458d-858f-ffb409383310,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd61e9e8-f7') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.298 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.298 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.299 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.299 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.300 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1dbe0aa4-c204-536d-925e-647a657a5e0f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.301 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.302 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.305 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.305 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd61e9e8-f7, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.306 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapbd61e9e8-f7, col_values=(('qos', UUID('97dc31ac-a220-4f90-8df4-0d2b2282eb37')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.306 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapbd61e9e8-f7, col_values=(('external_ids', {'iface-id': 'bd61e9e8-f7f0-458d-858f-ffb409383310', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:5f:29', 'vm-uuid': 'bfaf8926-00b3-46a4-b85f-46ee074d049e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.307 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:24 compute-1 NetworkManager[55553]: <info>  [1764721344.3081] manager: (tapbd61e9e8-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.309 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.313 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.314 187161 INFO os_vif [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:5f:29,bridge_name='br-int',has_traffic_filtering=True,id=bd61e9e8-f7f0-458d-858f-ffb409383310,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd61e9e8-f7')
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.314 187161 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.314 187161 DEBUG nova.compute.manager [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7an9wy_h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bfaf8926-00b3-46a4-b85f-46ee074d049e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.315 187161 WARNING neutronclient.v2_0.client [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.599 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:24 compute-1 nova_compute[187157]: 2025-12-03 00:22:24.615 187161 WARNING neutronclient.v2_0.client [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:27 compute-1 nova_compute[187157]: 2025-12-03 00:22:27.096 187161 DEBUG nova.network.neutron [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Port bd61e9e8-f7f0-458d-858f-ffb409383310 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 03 00:22:27 compute-1 nova_compute[187157]: 2025-12-03 00:22:27.112 187161 DEBUG nova.compute.manager [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7an9wy_h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bfaf8926-00b3-46a4-b85f-46ee074d049e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 03 00:22:29 compute-1 podman[220214]: 2025-12-03 00:22:29.223368557 +0000 UTC m=+0.065691297 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Dec 03 00:22:29 compute-1 nova_compute[187157]: 2025-12-03 00:22:29.343 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:29 compute-1 nova_compute[187157]: 2025-12-03 00:22:29.602 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:30 compute-1 kernel: tapbd61e9e8-f7: entered promiscuous mode
Dec 03 00:22:30 compute-1 nova_compute[187157]: 2025-12-03 00:22:30.168 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:30 compute-1 ovn_controller[95464]: 2025-12-03T00:22:30Z|00258|binding|INFO|Claiming lport bd61e9e8-f7f0-458d-858f-ffb409383310 for this additional chassis.
Dec 03 00:22:30 compute-1 ovn_controller[95464]: 2025-12-03T00:22:30Z|00259|binding|INFO|bd61e9e8-f7f0-458d-858f-ffb409383310: Claiming fa:16:3e:03:5f:29 10.100.0.13
Dec 03 00:22:30 compute-1 NetworkManager[55553]: <info>  [1764721350.1703] manager: (tapbd61e9e8-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Dec 03 00:22:30 compute-1 ovn_controller[95464]: 2025-12-03T00:22:30Z|00260|binding|INFO|Setting lport bd61e9e8-f7f0-458d-858f-ffb409383310 ovn-installed in OVS
Dec 03 00:22:30 compute-1 nova_compute[187157]: 2025-12-03 00:22:30.196 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:30 compute-1 nova_compute[187157]: 2025-12-03 00:22:30.201 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:30 compute-1 systemd-udevd[220248]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:22:30 compute-1 systemd-machined[153454]: New machine qemu-24-instance-0000001d.
Dec 03 00:22:30 compute-1 NetworkManager[55553]: <info>  [1764721350.2207] device (tapbd61e9e8-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:22:30 compute-1 NetworkManager[55553]: <info>  [1764721350.2224] device (tapbd61e9e8-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:22:30 compute-1 systemd[1]: Started Virtual Machine qemu-24-instance-0000001d.
Dec 03 00:22:30 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:30.471 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:5f:29 10.100.0.13'], port_security=['fa:16:3e:03:5f:29 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'bfaf8926-00b3-46a4-b85f-46ee074d049e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'aa7bb0b8-346d-4df1-ade9-c8e68672df4d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=bd61e9e8-f7f0-458d-858f-ffb409383310) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:22:30 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:30.475 104348 INFO neutron.agent.ovn.metadata.agent [-] Port bd61e9e8-f7f0-458d-858f-ffb409383310 in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 unbound from our chassis
Dec 03 00:22:30 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:30.476 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:22:30 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:30.493 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ca8a162b-4ce2-46c0-80b0-b4705cb0c292]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:30 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:30.528 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[8210027a-9db6-40ec-a5cb-35401f4bdc70]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:30 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:30.716 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[567d4351-e554-4be1-b1ad-4aac3c8cbae1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:30 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:30.741 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[c0897fe4-2c3e-43f6-b3bf-83c3221fecfd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:30 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:30.755 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b70f5948-fb43-4b1c-adb6-12774398b925]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7ff943d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:96:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 26, 'tx_packets': 5, 'rx_bytes': 1372, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 26, 'tx_packets': 5, 'rx_bytes': 1372, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533089, 'reachable_time': 19916, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220271, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:30 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:30.771 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[90a5d06b-15e1-4e85-bdbc-cfba005829e9]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7ff943d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 533100, 'tstamp': 533100}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220272, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7ff943d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 533103, 'tstamp': 533103}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220272, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:30 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:30.773 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ff943d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:30 compute-1 nova_compute[187157]: 2025-12-03 00:22:30.774 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:30 compute-1 nova_compute[187157]: 2025-12-03 00:22:30.775 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:30 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:30.776 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7ff943d-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:30 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:30.776 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:22:30 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:30.776 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7ff943d-e0, col_values=(('external_ids', {'iface-id': '636cd919-869d-4a8a-92fa-ec7c18804da5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:30 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:30.776 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:22:30 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:30.777 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[cac6aadc-1b6d-4eb3-b71a-a3dcbf3e16c8]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:32 compute-1 podman[220289]: 2025-12-03 00:22:32.224296898 +0000 UTC m=+0.066308652 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 00:22:33 compute-1 ovn_controller[95464]: 2025-12-03T00:22:33Z|00261|binding|INFO|Claiming lport bd61e9e8-f7f0-458d-858f-ffb409383310 for this chassis.
Dec 03 00:22:33 compute-1 ovn_controller[95464]: 2025-12-03T00:22:33Z|00262|binding|INFO|bd61e9e8-f7f0-458d-858f-ffb409383310: Claiming fa:16:3e:03:5f:29 10.100.0.13
Dec 03 00:22:33 compute-1 ovn_controller[95464]: 2025-12-03T00:22:33Z|00263|binding|INFO|Setting lport bd61e9e8-f7f0-458d-858f-ffb409383310 up in Southbound
Dec 03 00:22:34 compute-1 nova_compute[187157]: 2025-12-03 00:22:34.382 187161 INFO nova.compute.manager [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Post operation of migration started
Dec 03 00:22:34 compute-1 nova_compute[187157]: 2025-12-03 00:22:34.383 187161 WARNING neutronclient.v2_0.client [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:34 compute-1 nova_compute[187157]: 2025-12-03 00:22:34.385 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:34 compute-1 nova_compute[187157]: 2025-12-03 00:22:34.486 187161 WARNING neutronclient.v2_0.client [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:34 compute-1 nova_compute[187157]: 2025-12-03 00:22:34.487 187161 WARNING neutronclient.v2_0.client [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:34 compute-1 nova_compute[187157]: 2025-12-03 00:22:34.604 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:34 compute-1 nova_compute[187157]: 2025-12-03 00:22:34.754 187161 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-bfaf8926-00b3-46a4-b85f-46ee074d049e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:22:34 compute-1 nova_compute[187157]: 2025-12-03 00:22:34.754 187161 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-bfaf8926-00b3-46a4-b85f-46ee074d049e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:22:34 compute-1 nova_compute[187157]: 2025-12-03 00:22:34.755 187161 DEBUG nova.network.neutron [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:22:35 compute-1 nova_compute[187157]: 2025-12-03 00:22:35.263 187161 WARNING neutronclient.v2_0.client [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:35 compute-1 podman[197537]: time="2025-12-03T00:22:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:22:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:22:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:22:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:22:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3082 "" "Go-http-client/1.1"
Dec 03 00:22:38 compute-1 nova_compute[187157]: 2025-12-03 00:22:38.061 187161 WARNING neutronclient.v2_0.client [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:38 compute-1 nova_compute[187157]: 2025-12-03 00:22:38.245 187161 DEBUG nova.network.neutron [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Updating instance_info_cache with network_info: [{"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:22:38 compute-1 nova_compute[187157]: 2025-12-03 00:22:38.751 187161 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-bfaf8926-00b3-46a4-b85f-46ee074d049e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:22:39 compute-1 nova_compute[187157]: 2025-12-03 00:22:39.286 187161 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:39 compute-1 nova_compute[187157]: 2025-12-03 00:22:39.287 187161 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:39 compute-1 nova_compute[187157]: 2025-12-03 00:22:39.287 187161 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:39 compute-1 nova_compute[187157]: 2025-12-03 00:22:39.292 187161 INFO nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 03 00:22:39 compute-1 virtqemud[186882]: Domain id=24 name='instance-0000001d' uuid=bfaf8926-00b3-46a4-b85f-46ee074d049e is tainted: custom-monitor
Dec 03 00:22:39 compute-1 nova_compute[187157]: 2025-12-03 00:22:39.403 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:39 compute-1 nova_compute[187157]: 2025-12-03 00:22:39.606 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:40 compute-1 nova_compute[187157]: 2025-12-03 00:22:40.298 187161 INFO nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 03 00:22:41 compute-1 nova_compute[187157]: 2025-12-03 00:22:41.306 187161 INFO nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 03 00:22:41 compute-1 nova_compute[187157]: 2025-12-03 00:22:41.311 187161 DEBUG nova.compute.manager [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:22:41 compute-1 nova_compute[187157]: 2025-12-03 00:22:41.834 187161 DEBUG nova.objects.instance [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 03 00:22:42 compute-1 podman[220310]: 2025-12-03 00:22:42.239898334 +0000 UTC m=+0.067821850 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:22:42 compute-1 nova_compute[187157]: 2025-12-03 00:22:42.853 187161 WARNING neutronclient.v2_0.client [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:43 compute-1 nova_compute[187157]: 2025-12-03 00:22:43.023 187161 WARNING neutronclient.v2_0.client [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:43 compute-1 nova_compute[187157]: 2025-12-03 00:22:43.024 187161 WARNING neutronclient.v2_0.client [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:44 compute-1 podman[220333]: 2025-12-03 00:22:44.236710245 +0000 UTC m=+0.077216757 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 03 00:22:44 compute-1 podman[220334]: 2025-12-03 00:22:44.294337316 +0000 UTC m=+0.129274752 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:22:44 compute-1 nova_compute[187157]: 2025-12-03 00:22:44.406 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:44 compute-1 nova_compute[187157]: 2025-12-03 00:22:44.608 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:46 compute-1 nova_compute[187157]: 2025-12-03 00:22:46.852 187161 DEBUG oslo_concurrency.lockutils [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:46 compute-1 nova_compute[187157]: 2025-12-03 00:22:46.853 187161 DEBUG oslo_concurrency.lockutils [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:46 compute-1 nova_compute[187157]: 2025-12-03 00:22:46.853 187161 DEBUG oslo_concurrency.lockutils [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:46 compute-1 nova_compute[187157]: 2025-12-03 00:22:46.854 187161 DEBUG oslo_concurrency.lockutils [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:46 compute-1 nova_compute[187157]: 2025-12-03 00:22:46.854 187161 DEBUG oslo_concurrency.lockutils [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:46 compute-1 nova_compute[187157]: 2025-12-03 00:22:46.866 187161 INFO nova.compute.manager [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Terminating instance
Dec 03 00:22:47 compute-1 nova_compute[187157]: 2025-12-03 00:22:47.383 187161 DEBUG nova.compute.manager [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:22:47 compute-1 kernel: tapbd61e9e8-f7 (unregistering): left promiscuous mode
Dec 03 00:22:47 compute-1 NetworkManager[55553]: <info>  [1764721367.4080] device (tapbd61e9e8-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:22:47 compute-1 nova_compute[187157]: 2025-12-03 00:22:47.415 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:47 compute-1 ovn_controller[95464]: 2025-12-03T00:22:47Z|00264|binding|INFO|Releasing lport bd61e9e8-f7f0-458d-858f-ffb409383310 from this chassis (sb_readonly=0)
Dec 03 00:22:47 compute-1 ovn_controller[95464]: 2025-12-03T00:22:47Z|00265|binding|INFO|Setting lport bd61e9e8-f7f0-458d-858f-ffb409383310 down in Southbound
Dec 03 00:22:47 compute-1 ovn_controller[95464]: 2025-12-03T00:22:47Z|00266|binding|INFO|Removing iface tapbd61e9e8-f7 ovn-installed in OVS
Dec 03 00:22:47 compute-1 nova_compute[187157]: 2025-12-03 00:22:47.417 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:47 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:47.421 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:5f:29 10.100.0.13'], port_security=['fa:16:3e:03:5f:29 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'bfaf8926-00b3-46a4-b85f-46ee074d049e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'aa7bb0b8-346d-4df1-ade9-c8e68672df4d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=bd61e9e8-f7f0-458d-858f-ffb409383310) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:22:47 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:47.422 104348 INFO neutron.agent.ovn.metadata.agent [-] Port bd61e9e8-f7f0-458d-858f-ffb409383310 in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 unbound from our chassis
Dec 03 00:22:47 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:47.423 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:22:47 compute-1 nova_compute[187157]: 2025-12-03 00:22:47.429 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:47 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:47.437 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a16767ec-4728-4c54-a0ed-0827b2d15eb6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:47 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:47.463 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[b73d2c79-df27-4bd1-a912-6079374ac48a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:47 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:47.466 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[8c54dd33-7f4d-4a8a-9c9a-7c62a0058dae]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:47 compute-1 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Dec 03 00:22:47 compute-1 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001d.scope: Consumed 1.732s CPU time.
Dec 03 00:22:47 compute-1 systemd-machined[153454]: Machine qemu-24-instance-0000001d terminated.
Dec 03 00:22:47 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:47.486 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[12e5c75a-9e7d-4d2d-8209-a1ea233c58a3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:47 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:47.499 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[3ad86944-a9a3-4b19-b381-896eaf35eb10]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7ff943d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:96:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 46, 'tx_packets': 7, 'rx_bytes': 2212, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 46, 'tx_packets': 7, 'rx_bytes': 2212, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533089, 'reachable_time': 19916, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220388, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:47 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:47.514 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[7bc13f51-99ed-4e2a-af1d-584c81a3060f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7ff943d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 533100, 'tstamp': 533100}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220389, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7ff943d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 533103, 'tstamp': 533103}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220389, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:47 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:47.515 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ff943d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:47 compute-1 nova_compute[187157]: 2025-12-03 00:22:47.516 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:47 compute-1 nova_compute[187157]: 2025-12-03 00:22:47.519 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:47 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:47.520 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7ff943d-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:47 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:47.520 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:22:47 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:47.520 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7ff943d-e0, col_values=(('external_ids', {'iface-id': '636cd919-869d-4a8a-92fa-ec7c18804da5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:47 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:47.520 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:22:47 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:47.521 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[494485d7-63d2-46d4-a6fd-a803fcd69c3d]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:47 compute-1 nova_compute[187157]: 2025-12-03 00:22:47.586 187161 DEBUG nova.compute.manager [req-36bb2898-9092-4a44-8460-6bc169a458ef req-5c216502-51a8-4633-8148-11b28f428003 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:22:47 compute-1 nova_compute[187157]: 2025-12-03 00:22:47.586 187161 DEBUG oslo_concurrency.lockutils [req-36bb2898-9092-4a44-8460-6bc169a458ef req-5c216502-51a8-4633-8148-11b28f428003 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:47 compute-1 nova_compute[187157]: 2025-12-03 00:22:47.587 187161 DEBUG oslo_concurrency.lockutils [req-36bb2898-9092-4a44-8460-6bc169a458ef req-5c216502-51a8-4633-8148-11b28f428003 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:47 compute-1 nova_compute[187157]: 2025-12-03 00:22:47.587 187161 DEBUG oslo_concurrency.lockutils [req-36bb2898-9092-4a44-8460-6bc169a458ef req-5c216502-51a8-4633-8148-11b28f428003 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:47 compute-1 nova_compute[187157]: 2025-12-03 00:22:47.588 187161 DEBUG nova.compute.manager [req-36bb2898-9092-4a44-8460-6bc169a458ef req-5c216502-51a8-4633-8148-11b28f428003 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] No waiting events found dispatching network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:22:47 compute-1 nova_compute[187157]: 2025-12-03 00:22:47.588 187161 DEBUG nova.compute.manager [req-36bb2898-9092-4a44-8460-6bc169a458ef req-5c216502-51a8-4633-8148-11b28f428003 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:22:47 compute-1 nova_compute[187157]: 2025-12-03 00:22:47.609 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:47 compute-1 nova_compute[187157]: 2025-12-03 00:22:47.615 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:47 compute-1 nova_compute[187157]: 2025-12-03 00:22:47.648 187161 INFO nova.virt.libvirt.driver [-] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Instance destroyed successfully.
Dec 03 00:22:47 compute-1 nova_compute[187157]: 2025-12-03 00:22:47.648 187161 DEBUG nova.objects.instance [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lazy-loading 'resources' on Instance uuid bfaf8926-00b3-46a4-b85f-46ee074d049e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.154 187161 DEBUG nova.virt.libvirt.vif [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-03T00:20:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-350860971',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-3508609',id=29,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:21:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-7vu1e8y1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:22:42Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=bfaf8926-00b3-46a4-b85f-46ee074d049e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.155 187161 DEBUG nova.network.os_vif_util [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converting VIF {"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.156 187161 DEBUG nova.network.os_vif_util [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:03:5f:29,bridge_name='br-int',has_traffic_filtering=True,id=bd61e9e8-f7f0-458d-858f-ffb409383310,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd61e9e8-f7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.156 187161 DEBUG os_vif [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:5f:29,bridge_name='br-int',has_traffic_filtering=True,id=bd61e9e8-f7f0-458d-858f-ffb409383310,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd61e9e8-f7') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.158 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.159 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd61e9e8-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.160 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.164 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.165 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.165 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=97dc31ac-a220-4f90-8df4-0d2b2282eb37) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.166 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.167 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.169 187161 INFO os_vif [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:5f:29,bridge_name='br-int',has_traffic_filtering=True,id=bd61e9e8-f7f0-458d-858f-ffb409383310,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd61e9e8-f7')
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.169 187161 INFO nova.virt.libvirt.driver [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Deleting instance files /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e_del
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.170 187161 INFO nova.virt.libvirt.driver [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Deletion of /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e_del complete
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.681 187161 INFO nova.compute.manager [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Took 1.30 seconds to destroy the instance on the hypervisor.
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.682 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.682 187161 DEBUG nova.compute.manager [-] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.682 187161 DEBUG nova.network.neutron [-] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.682 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:48 compute-1 nova_compute[187157]: 2025-12-03 00:22:48.850 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:49 compute-1 nova_compute[187157]: 2025-12-03 00:22:49.149 187161 DEBUG nova.compute.manager [req-261340ba-8069-4fbd-a6fc-3b8f3726842e req-a47f1a43-dc04-44ef-ad59-04b7ead13f06 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-deleted-bd61e9e8-f7f0-458d-858f-ffb409383310 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:22:49 compute-1 nova_compute[187157]: 2025-12-03 00:22:49.150 187161 INFO nova.compute.manager [req-261340ba-8069-4fbd-a6fc-3b8f3726842e req-a47f1a43-dc04-44ef-ad59-04b7ead13f06 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Neutron deleted interface bd61e9e8-f7f0-458d-858f-ffb409383310; detaching it from the instance and deleting it from the info cache
Dec 03 00:22:49 compute-1 nova_compute[187157]: 2025-12-03 00:22:49.150 187161 DEBUG nova.network.neutron [req-261340ba-8069-4fbd-a6fc-3b8f3726842e req-a47f1a43-dc04-44ef-ad59-04b7ead13f06 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:22:49 compute-1 openstack_network_exporter[199685]: ERROR   00:22:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:22:49 compute-1 openstack_network_exporter[199685]: ERROR   00:22:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:22:49 compute-1 openstack_network_exporter[199685]: ERROR   00:22:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:22:49 compute-1 openstack_network_exporter[199685]: ERROR   00:22:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:22:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:22:49 compute-1 openstack_network_exporter[199685]: ERROR   00:22:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:22:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:22:49 compute-1 nova_compute[187157]: 2025-12-03 00:22:49.601 187161 DEBUG nova.network.neutron [-] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:22:49 compute-1 nova_compute[187157]: 2025-12-03 00:22:49.653 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:49 compute-1 nova_compute[187157]: 2025-12-03 00:22:49.657 187161 DEBUG nova.compute.manager [req-261340ba-8069-4fbd-a6fc-3b8f3726842e req-a47f1a43-dc04-44ef-ad59-04b7ead13f06 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Detach interface failed, port_id=bd61e9e8-f7f0-458d-858f-ffb409383310, reason: Instance bfaf8926-00b3-46a4-b85f-46ee074d049e could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:22:49 compute-1 nova_compute[187157]: 2025-12-03 00:22:49.690 187161 DEBUG nova.compute.manager [req-af64be97-9caa-41a9-a2cb-56e1cce15ed3 req-1c1f6de8-b45a-4a3f-842b-bc211f16fe30 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:22:49 compute-1 nova_compute[187157]: 2025-12-03 00:22:49.691 187161 DEBUG oslo_concurrency.lockutils [req-af64be97-9caa-41a9-a2cb-56e1cce15ed3 req-1c1f6de8-b45a-4a3f-842b-bc211f16fe30 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:49 compute-1 nova_compute[187157]: 2025-12-03 00:22:49.692 187161 DEBUG oslo_concurrency.lockutils [req-af64be97-9caa-41a9-a2cb-56e1cce15ed3 req-1c1f6de8-b45a-4a3f-842b-bc211f16fe30 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:49 compute-1 nova_compute[187157]: 2025-12-03 00:22:49.693 187161 DEBUG oslo_concurrency.lockutils [req-af64be97-9caa-41a9-a2cb-56e1cce15ed3 req-1c1f6de8-b45a-4a3f-842b-bc211f16fe30 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:49 compute-1 nova_compute[187157]: 2025-12-03 00:22:49.693 187161 DEBUG nova.compute.manager [req-af64be97-9caa-41a9-a2cb-56e1cce15ed3 req-1c1f6de8-b45a-4a3f-842b-bc211f16fe30 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] No waiting events found dispatching network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:22:49 compute-1 nova_compute[187157]: 2025-12-03 00:22:49.693 187161 DEBUG nova.compute.manager [req-af64be97-9caa-41a9-a2cb-56e1cce15ed3 req-1c1f6de8-b45a-4a3f-842b-bc211f16fe30 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:22:50 compute-1 nova_compute[187157]: 2025-12-03 00:22:50.182 187161 INFO nova.compute.manager [-] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Took 1.50 seconds to deallocate network for instance.
Dec 03 00:22:50 compute-1 nova_compute[187157]: 2025-12-03 00:22:50.703 187161 DEBUG oslo_concurrency.lockutils [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:50 compute-1 nova_compute[187157]: 2025-12-03 00:22:50.704 187161 DEBUG oslo_concurrency.lockutils [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:50 compute-1 nova_compute[187157]: 2025-12-03 00:22:50.710 187161 DEBUG oslo_concurrency.lockutils [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:50 compute-1 nova_compute[187157]: 2025-12-03 00:22:50.751 187161 INFO nova.scheduler.client.report [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Deleted allocations for instance bfaf8926-00b3-46a4-b85f-46ee074d049e
Dec 03 00:22:51 compute-1 nova_compute[187157]: 2025-12-03 00:22:51.786 187161 DEBUG oslo_concurrency.lockutils [None req-a2707630-0302-4c8d-8a67-bf0bf1a63879 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.933s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:52 compute-1 nova_compute[187157]: 2025-12-03 00:22:52.775 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:52 compute-1 nova_compute[187157]: 2025-12-03 00:22:52.776 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:52 compute-1 nova_compute[187157]: 2025-12-03 00:22:52.821 187161 DEBUG oslo_concurrency.lockutils [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "7643810a-7499-484f-80e2-2a0a33cafc55" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:52 compute-1 nova_compute[187157]: 2025-12-03 00:22:52.821 187161 DEBUG oslo_concurrency.lockutils [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:52 compute-1 nova_compute[187157]: 2025-12-03 00:22:52.821 187161 DEBUG oslo_concurrency.lockutils [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:52 compute-1 nova_compute[187157]: 2025-12-03 00:22:52.822 187161 DEBUG oslo_concurrency.lockutils [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:52 compute-1 nova_compute[187157]: 2025-12-03 00:22:52.822 187161 DEBUG oslo_concurrency.lockutils [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:52 compute-1 nova_compute[187157]: 2025-12-03 00:22:52.838 187161 INFO nova.compute.manager [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Terminating instance
Dec 03 00:22:53 compute-1 nova_compute[187157]: 2025-12-03 00:22:53.228 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:53 compute-1 nova_compute[187157]: 2025-12-03 00:22:53.357 187161 DEBUG nova.compute.manager [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:22:53 compute-1 kernel: tap96aba6d6-d4 (unregistering): left promiscuous mode
Dec 03 00:22:53 compute-1 NetworkManager[55553]: <info>  [1764721373.3903] device (tap96aba6d6-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:22:53 compute-1 ovn_controller[95464]: 2025-12-03T00:22:53Z|00267|binding|INFO|Releasing lport 96aba6d6-d4d8-494d-9070-4ad5c1609fdf from this chassis (sb_readonly=0)
Dec 03 00:22:53 compute-1 ovn_controller[95464]: 2025-12-03T00:22:53Z|00268|binding|INFO|Setting lport 96aba6d6-d4d8-494d-9070-4ad5c1609fdf down in Southbound
Dec 03 00:22:53 compute-1 nova_compute[187157]: 2025-12-03 00:22:53.391 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:53 compute-1 ovn_controller[95464]: 2025-12-03T00:22:53Z|00269|binding|INFO|Removing iface tap96aba6d6-d4 ovn-installed in OVS
Dec 03 00:22:53 compute-1 nova_compute[187157]: 2025-12-03 00:22:53.393 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:53.407 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:ff:2a 10.100.0.8'], port_security=['fa:16:3e:9a:ff:2a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7643810a-7499-484f-80e2-2a0a33cafc55', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'aa7bb0b8-346d-4df1-ade9-c8e68672df4d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=96aba6d6-d4d8-494d-9070-4ad5c1609fdf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:22:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:53.408 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 96aba6d6-d4d8-494d-9070-4ad5c1609fdf in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 unbound from our chassis
Dec 03 00:22:53 compute-1 nova_compute[187157]: 2025-12-03 00:22:53.409 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:53.410 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:22:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:53.411 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a69783-5d8f-42be-878c-beef52d64c10]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:53.412 104348 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 namespace which is not needed anymore
Dec 03 00:22:53 compute-1 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Dec 03 00:22:53 compute-1 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001c.scope: Consumed 3.868s CPU time.
Dec 03 00:22:53 compute-1 systemd-machined[153454]: Machine qemu-23-instance-0000001c terminated.
Dec 03 00:22:53 compute-1 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[220054]: [NOTICE]   (220058) : haproxy version is 3.0.5-8e879a5
Dec 03 00:22:53 compute-1 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[220054]: [NOTICE]   (220058) : path to executable is /usr/sbin/haproxy
Dec 03 00:22:53 compute-1 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[220054]: [WARNING]  (220058) : Exiting Master process...
Dec 03 00:22:53 compute-1 podman[220432]: 2025-12-03 00:22:53.527845176 +0000 UTC m=+0.032603194 container kill 63f8f13c363b919068d7005d209cbf03733b7966c46d8a2561731e81f6fcbda7 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:22:53 compute-1 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[220054]: [ALERT]    (220058) : Current worker (220060) exited with code 143 (Terminated)
Dec 03 00:22:53 compute-1 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[220054]: [WARNING]  (220058) : All workers exited. Exiting... (0)
Dec 03 00:22:53 compute-1 systemd[1]: libpod-63f8f13c363b919068d7005d209cbf03733b7966c46d8a2561731e81f6fcbda7.scope: Deactivated successfully.
Dec 03 00:22:53 compute-1 podman[220447]: 2025-12-03 00:22:53.564834205 +0000 UTC m=+0.024016115 container died 63f8f13c363b919068d7005d209cbf03733b7966c46d8a2561731e81f6fcbda7 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 03 00:22:53 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-63f8f13c363b919068d7005d209cbf03733b7966c46d8a2561731e81f6fcbda7-userdata-shm.mount: Deactivated successfully.
Dec 03 00:22:53 compute-1 systemd[1]: var-lib-containers-storage-overlay-83b2803fd9806c65b9a403c30d9cf57712ed3a38e6ae8d1833df1c5d6824f716-merged.mount: Deactivated successfully.
Dec 03 00:22:53 compute-1 podman[220447]: 2025-12-03 00:22:53.610004632 +0000 UTC m=+0.069186522 container cleanup 63f8f13c363b919068d7005d209cbf03733b7966c46d8a2561731e81f6fcbda7 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:22:53 compute-1 systemd[1]: libpod-conmon-63f8f13c363b919068d7005d209cbf03733b7966c46d8a2561731e81f6fcbda7.scope: Deactivated successfully.
Dec 03 00:22:53 compute-1 nova_compute[187157]: 2025-12-03 00:22:53.623 187161 INFO nova.virt.libvirt.driver [-] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Instance destroyed successfully.
Dec 03 00:22:53 compute-1 nova_compute[187157]: 2025-12-03 00:22:53.624 187161 DEBUG nova.objects.instance [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lazy-loading 'resources' on Instance uuid 7643810a-7499-484f-80e2-2a0a33cafc55 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:22:53 compute-1 podman[220454]: 2025-12-03 00:22:53.62637936 +0000 UTC m=+0.062373257 container remove 63f8f13c363b919068d7005d209cbf03733b7966c46d8a2561731e81f6fcbda7 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 03 00:22:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:53.631 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[61349c09-e2dd-4eed-a986-6c1d0bcbbb01]: (4, ("Wed Dec  3 12:22:53 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 (63f8f13c363b919068d7005d209cbf03733b7966c46d8a2561731e81f6fcbda7)\n63f8f13c363b919068d7005d209cbf03733b7966c46d8a2561731e81f6fcbda7\nWed Dec  3 12:22:53 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 (63f8f13c363b919068d7005d209cbf03733b7966c46d8a2561731e81f6fcbda7)\n63f8f13c363b919068d7005d209cbf03733b7966c46d8a2561731e81f6fcbda7\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:53.632 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[c03cc6e4-ad13-44d3-87c1-450155c1cc07]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:53.632 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:22:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:53.633 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[5d399bb9-c6b7-4333-8d4a-1ba7329180fe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:53.633 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ff943d-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:53 compute-1 nova_compute[187157]: 2025-12-03 00:22:53.635 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:53 compute-1 kernel: tapf7ff943d-e0: left promiscuous mode
Dec 03 00:22:53 compute-1 nova_compute[187157]: 2025-12-03 00:22:53.649 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:53.650 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[71233c44-0d06-4c53-a52f-8369bd1adf8c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:53.667 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[2a363e33-379d-4a50-b96b-856b8b9e9d0c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:53.668 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[74dcdd58-31b4-45cf-896e-0b44dc791d38]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:53.681 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a3b43bf6-3cf7-4c62-abfe-23d0dcba4913]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533082, 'reachable_time': 44374, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220498, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:53.683 104464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:22:53 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:53.683 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c9675a-fcc3-4278-bcce-01ebe8e93303]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:53 compute-1 systemd[1]: run-netns-ovnmeta\x2df7ff943d\x2de57d\x2d4bc2\x2d8dd6\x2df8a8bb6e4f89.mount: Deactivated successfully.
Dec 03 00:22:53 compute-1 nova_compute[187157]: 2025-12-03 00:22:53.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:53 compute-1 nova_compute[187157]: 2025-12-03 00:22:53.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:54 compute-1 nova_compute[187157]: 2025-12-03 00:22:54.132 187161 DEBUG nova.virt.libvirt.vif [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-03T00:20:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1137830821',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1137830',id=28,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:20:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-0dp6pegi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:22:08Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=7643810a-7499-484f-80e2-2a0a33cafc55,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:22:54 compute-1 nova_compute[187157]: 2025-12-03 00:22:54.133 187161 DEBUG nova.network.os_vif_util [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converting VIF {"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:22:54 compute-1 nova_compute[187157]: 2025-12-03 00:22:54.134 187161 DEBUG nova.network.os_vif_util [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9a:ff:2a,bridge_name='br-int',has_traffic_filtering=True,id=96aba6d6-d4d8-494d-9070-4ad5c1609fdf,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96aba6d6-d4') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:22:54 compute-1 nova_compute[187157]: 2025-12-03 00:22:54.134 187161 DEBUG os_vif [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:ff:2a,bridge_name='br-int',has_traffic_filtering=True,id=96aba6d6-d4d8-494d-9070-4ad5c1609fdf,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96aba6d6-d4') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:22:54 compute-1 nova_compute[187157]: 2025-12-03 00:22:54.136 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:54 compute-1 nova_compute[187157]: 2025-12-03 00:22:54.136 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap96aba6d6-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:54 compute-1 nova_compute[187157]: 2025-12-03 00:22:54.137 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:54 compute-1 nova_compute[187157]: 2025-12-03 00:22:54.139 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:54 compute-1 nova_compute[187157]: 2025-12-03 00:22:54.140 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:54 compute-1 nova_compute[187157]: 2025-12-03 00:22:54.140 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=a4ed2d98-b04c-4b34-aabe-a2b6daea767c) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:54 compute-1 nova_compute[187157]: 2025-12-03 00:22:54.141 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:54 compute-1 nova_compute[187157]: 2025-12-03 00:22:54.142 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:54 compute-1 nova_compute[187157]: 2025-12-03 00:22:54.144 187161 INFO os_vif [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:ff:2a,bridge_name='br-int',has_traffic_filtering=True,id=96aba6d6-d4d8-494d-9070-4ad5c1609fdf,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96aba6d6-d4')
Dec 03 00:22:54 compute-1 nova_compute[187157]: 2025-12-03 00:22:54.144 187161 INFO nova.virt.libvirt.driver [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Deleting instance files /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55_del
Dec 03 00:22:54 compute-1 nova_compute[187157]: 2025-12-03 00:22:54.145 187161 INFO nova.virt.libvirt.driver [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Deletion of /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55_del complete
Dec 03 00:22:54 compute-1 nova_compute[187157]: 2025-12-03 00:22:54.688 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:55 compute-1 nova_compute[187157]: 2025-12-03 00:22:55.068 187161 DEBUG nova.compute.manager [req-214fdaff-13df-4c9a-a31a-a49606bb6889 req-dcf1a803-8f07-4a52-96f2-c64ce516bf6d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-vif-unplugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:22:55 compute-1 nova_compute[187157]: 2025-12-03 00:22:55.069 187161 DEBUG oslo_concurrency.lockutils [req-214fdaff-13df-4c9a-a31a-a49606bb6889 req-dcf1a803-8f07-4a52-96f2-c64ce516bf6d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:55 compute-1 nova_compute[187157]: 2025-12-03 00:22:55.069 187161 DEBUG oslo_concurrency.lockutils [req-214fdaff-13df-4c9a-a31a-a49606bb6889 req-dcf1a803-8f07-4a52-96f2-c64ce516bf6d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:55 compute-1 nova_compute[187157]: 2025-12-03 00:22:55.070 187161 DEBUG oslo_concurrency.lockutils [req-214fdaff-13df-4c9a-a31a-a49606bb6889 req-dcf1a803-8f07-4a52-96f2-c64ce516bf6d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:55 compute-1 nova_compute[187157]: 2025-12-03 00:22:55.070 187161 DEBUG nova.compute.manager [req-214fdaff-13df-4c9a-a31a-a49606bb6889 req-dcf1a803-8f07-4a52-96f2-c64ce516bf6d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] No waiting events found dispatching network-vif-unplugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:22:55 compute-1 nova_compute[187157]: 2025-12-03 00:22:55.070 187161 DEBUG nova.compute.manager [req-214fdaff-13df-4c9a-a31a-a49606bb6889 req-dcf1a803-8f07-4a52-96f2-c64ce516bf6d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-vif-unplugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:22:55 compute-1 nova_compute[187157]: 2025-12-03 00:22:55.161 187161 INFO nova.compute.manager [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Took 1.80 seconds to destroy the instance on the hypervisor.
Dec 03 00:22:55 compute-1 nova_compute[187157]: 2025-12-03 00:22:55.161 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:22:55 compute-1 nova_compute[187157]: 2025-12-03 00:22:55.162 187161 DEBUG nova.compute.manager [-] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:22:55 compute-1 nova_compute[187157]: 2025-12-03 00:22:55.162 187161 DEBUG nova.network.neutron [-] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:22:55 compute-1 nova_compute[187157]: 2025-12-03 00:22:55.162 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:56 compute-1 nova_compute[187157]: 2025-12-03 00:22:56.028 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:56.060 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:22:56 compute-1 nova_compute[187157]: 2025-12-03 00:22:56.060 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:56 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:22:56.061 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:22:56 compute-1 nova_compute[187157]: 2025-12-03 00:22:56.207 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:56 compute-1 nova_compute[187157]: 2025-12-03 00:22:56.208 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.144 187161 DEBUG nova.compute.manager [req-e6e6c40b-6635-44a5-98e3-4195c82fdacd req-77905004-d859-407d-a944-acbc183704a3 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-vif-unplugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.144 187161 DEBUG oslo_concurrency.lockutils [req-e6e6c40b-6635-44a5-98e3-4195c82fdacd req-77905004-d859-407d-a944-acbc183704a3 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.144 187161 DEBUG oslo_concurrency.lockutils [req-e6e6c40b-6635-44a5-98e3-4195c82fdacd req-77905004-d859-407d-a944-acbc183704a3 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.145 187161 DEBUG oslo_concurrency.lockutils [req-e6e6c40b-6635-44a5-98e3-4195c82fdacd req-77905004-d859-407d-a944-acbc183704a3 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.145 187161 DEBUG nova.compute.manager [req-e6e6c40b-6635-44a5-98e3-4195c82fdacd req-77905004-d859-407d-a944-acbc183704a3 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] No waiting events found dispatching network-vif-unplugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.145 187161 DEBUG nova.compute.manager [req-e6e6c40b-6635-44a5-98e3-4195c82fdacd req-77905004-d859-407d-a944-acbc183704a3 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-vif-unplugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.207 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.719 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.720 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.720 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.720 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.851 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.852 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.883 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.883 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5778MB free_disk=73.16104888916016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.883 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:57 compute-1 nova_compute[187157]: 2025-12-03 00:22:57.884 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:58 compute-1 nova_compute[187157]: 2025-12-03 00:22:58.488 187161 DEBUG nova.network.neutron [-] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:22:58 compute-1 nova_compute[187157]: 2025-12-03 00:22:58.932 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 7643810a-7499-484f-80e2-2a0a33cafc55 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:22:58 compute-1 nova_compute[187157]: 2025-12-03 00:22:58.932 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:22:58 compute-1 nova_compute[187157]: 2025-12-03 00:22:58.933 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:22:57 up  1:29,  0 user,  load average: 0.14, 0.15, 0.24\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_e363b47741a1476ca7e5987b6d15acb5': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:22:58 compute-1 nova_compute[187157]: 2025-12-03 00:22:58.961 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:22:58 compute-1 nova_compute[187157]: 2025-12-03 00:22:58.994 187161 INFO nova.compute.manager [-] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Took 3.83 seconds to deallocate network for instance.
Dec 03 00:22:59 compute-1 nova_compute[187157]: 2025-12-03 00:22:59.188 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:59 compute-1 nova_compute[187157]: 2025-12-03 00:22:59.209 187161 DEBUG nova.compute.manager [req-9d8582f2-46a7-46e4-a9ec-5777c3df5795 req-9add4835-b205-4c28-9fc0-b4cb9d38368b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-vif-deleted-96aba6d6-d4d8-494d-9070-4ad5c1609fdf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:22:59 compute-1 nova_compute[187157]: 2025-12-03 00:22:59.469 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:22:59 compute-1 nova_compute[187157]: 2025-12-03 00:22:59.515 187161 DEBUG oslo_concurrency.lockutils [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:59 compute-1 nova_compute[187157]: 2025-12-03 00:22:59.691 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:59 compute-1 nova_compute[187157]: 2025-12-03 00:22:59.985 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:22:59 compute-1 nova_compute[187157]: 2025-12-03 00:22:59.986 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:59 compute-1 nova_compute[187157]: 2025-12-03 00:22:59.986 187161 DEBUG oslo_concurrency.lockutils [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.471s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:23:00 compute-1 nova_compute[187157]: 2025-12-03 00:23:00.029 187161 DEBUG nova.compute.provider_tree [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:23:00 compute-1 podman[220502]: 2025-12-03 00:23:00.260375154 +0000 UTC m=+0.090355037 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., release=1755695350, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41)
Dec 03 00:23:00 compute-1 nova_compute[187157]: 2025-12-03 00:23:00.536 187161 DEBUG nova.scheduler.client.report [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:23:01 compute-1 nova_compute[187157]: 2025-12-03 00:23:01.268 187161 DEBUG oslo_concurrency.lockutils [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.282s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:23:01 compute-1 nova_compute[187157]: 2025-12-03 00:23:01.308 187161 INFO nova.scheduler.client.report [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Deleted allocations for instance 7643810a-7499-484f-80e2-2a0a33cafc55
Dec 03 00:23:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:23:01.750 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:23:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:23:01.750 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:23:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:23:01.751 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:23:02 compute-1 nova_compute[187157]: 2025-12-03 00:23:02.336 187161 DEBUG oslo_concurrency.lockutils [None req-ab4145d3-f8e8-493f-a2aa-057c4322a366 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.515s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:23:03 compute-1 podman[220525]: 2025-12-03 00:23:03.220303538 +0000 UTC m=+0.066231381 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 03 00:23:03 compute-1 nova_compute[187157]: 2025-12-03 00:23:03.474 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:03 compute-1 nova_compute[187157]: 2025-12-03 00:23:03.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:03 compute-1 nova_compute[187157]: 2025-12-03 00:23:03.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:23:04 compute-1 nova_compute[187157]: 2025-12-03 00:23:04.190 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:04 compute-1 nova_compute[187157]: 2025-12-03 00:23:04.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:04 compute-1 nova_compute[187157]: 2025-12-03 00:23:04.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:04 compute-1 nova_compute[187157]: 2025-12-03 00:23:04.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 03 00:23:04 compute-1 nova_compute[187157]: 2025-12-03 00:23:04.721 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:05 compute-1 nova_compute[187157]: 2025-12-03 00:23:05.225 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 03 00:23:05 compute-1 podman[197537]: time="2025-12-03T00:23:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:23:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:23:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:23:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:23:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2616 "" "Go-http-client/1.1"
Dec 03 00:23:06 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:23:06.063 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:23:06 compute-1 nova_compute[187157]: 2025-12-03 00:23:06.680 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:07 compute-1 nova_compute[187157]: 2025-12-03 00:23:07.226 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:09 compute-1 nova_compute[187157]: 2025-12-03 00:23:09.192 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:09 compute-1 nova_compute[187157]: 2025-12-03 00:23:09.695 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:09 compute-1 nova_compute[187157]: 2025-12-03 00:23:09.779 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:13 compute-1 podman[220548]: 2025-12-03 00:23:13.224228609 +0000 UTC m=+0.063713830 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:23:13 compute-1 sshd-session[220546]: Invalid user ubuntu from 45.148.10.240 port 42032
Dec 03 00:23:13 compute-1 sshd-session[220546]: Connection closed by invalid user ubuntu 45.148.10.240 port 42032 [preauth]
Dec 03 00:23:14 compute-1 nova_compute[187157]: 2025-12-03 00:23:14.194 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:14 compute-1 nova_compute[187157]: 2025-12-03 00:23:14.779 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:15 compute-1 podman[220573]: 2025-12-03 00:23:15.215268671 +0000 UTC m=+0.055620353 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 03 00:23:15 compute-1 podman[220574]: 2025-12-03 00:23:15.23953062 +0000 UTC m=+0.080423755 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 00:23:18 compute-1 nova_compute[187157]: 2025-12-03 00:23:18.477 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:19 compute-1 nova_compute[187157]: 2025-12-03 00:23:19.196 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:19 compute-1 openstack_network_exporter[199685]: ERROR   00:23:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:23:19 compute-1 openstack_network_exporter[199685]: ERROR   00:23:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:23:19 compute-1 openstack_network_exporter[199685]: ERROR   00:23:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:23:19 compute-1 openstack_network_exporter[199685]: ERROR   00:23:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:23:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:23:19 compute-1 openstack_network_exporter[199685]: ERROR   00:23:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:23:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:23:19 compute-1 nova_compute[187157]: 2025-12-03 00:23:19.780 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:23:22.134 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9c:f9:e6 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '714680a21a7947948f824493a7b261e0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c447000-beb4-4b86-8116-0ff3837374dd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=45446e36-d2c9-4ea6-b9fb-83e2711350dd) old=Port_Binding(mac=['fa:16:3e:9c:f9:e6'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '714680a21a7947948f824493a7b261e0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:23:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:23:22.135 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 45446e36-d2c9-4ea6-b9fb-83e2711350dd in datapath f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 updated
Dec 03 00:23:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:23:22.136 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:23:22 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:23:22.137 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a000b924-3c94-4621-987e-15f2dcfbeea1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:24 compute-1 nova_compute[187157]: 2025-12-03 00:23:24.198 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:24 compute-1 nova_compute[187157]: 2025-12-03 00:23:24.835 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:29 compute-1 nova_compute[187157]: 2025-12-03 00:23:29.199 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:29 compute-1 nova_compute[187157]: 2025-12-03 00:23:29.836 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:31 compute-1 podman[220619]: 2025-12-03 00:23:31.216148284 +0000 UTC m=+0.053404597 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=edpm, io.buildah.version=1.33.7)
Dec 03 00:23:33 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:23:33.692 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:a4:ad 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cb480f63-2911-490a-aba2-8454934ba6e8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cb480f63-2911-490a-aba2-8454934ba6e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8545a5c94f84697a8605fadf08251f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=705ffc34-85ae-4eb2-b23d-c0cdb18a4c59, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=feb85889-8253-4b4d-b822-af965338aa22) old=Port_Binding(mac=['fa:16:3e:87:a4:ad'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-cb480f63-2911-490a-aba2-8454934ba6e8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cb480f63-2911-490a-aba2-8454934ba6e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8545a5c94f84697a8605fadf08251f7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:23:33 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:23:33.693 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port feb85889-8253-4b4d-b822-af965338aa22 in datapath cb480f63-2911-490a-aba2-8454934ba6e8 updated
Dec 03 00:23:33 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:23:33.694 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cb480f63-2911-490a-aba2-8454934ba6e8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:23:33 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:23:33.694 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[25fb8f92-c447-4898-8762-951540690f98]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:34 compute-1 nova_compute[187157]: 2025-12-03 00:23:34.202 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:34 compute-1 podman[220641]: 2025-12-03 00:23:34.28050916 +0000 UTC m=+0.112918650 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:23:34 compute-1 nova_compute[187157]: 2025-12-03 00:23:34.883 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:35 compute-1 podman[197537]: time="2025-12-03T00:23:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:23:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:23:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:23:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:23:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2619 "" "Go-http-client/1.1"
Dec 03 00:23:39 compute-1 nova_compute[187157]: 2025-12-03 00:23:39.204 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:39 compute-1 nova_compute[187157]: 2025-12-03 00:23:39.884 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:42 compute-1 ovn_controller[95464]: 2025-12-03T00:23:42Z|00270|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Dec 03 00:23:44 compute-1 podman[220661]: 2025-12-03 00:23:44.196382216 +0000 UTC m=+0.043789286 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:23:44 compute-1 nova_compute[187157]: 2025-12-03 00:23:44.205 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:44 compute-1 nova_compute[187157]: 2025-12-03 00:23:44.914 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:46 compute-1 podman[220685]: 2025-12-03 00:23:46.211994999 +0000 UTC m=+0.056990602 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 03 00:23:46 compute-1 podman[220686]: 2025-12-03 00:23:46.302286374 +0000 UTC m=+0.136482298 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:23:49 compute-1 nova_compute[187157]: 2025-12-03 00:23:49.207 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:49 compute-1 openstack_network_exporter[199685]: ERROR   00:23:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:23:49 compute-1 openstack_network_exporter[199685]: ERROR   00:23:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:23:49 compute-1 openstack_network_exporter[199685]: ERROR   00:23:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:23:49 compute-1 openstack_network_exporter[199685]: ERROR   00:23:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:23:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:23:49 compute-1 openstack_network_exporter[199685]: ERROR   00:23:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:23:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:23:49 compute-1 nova_compute[187157]: 2025-12-03 00:23:49.916 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:52 compute-1 nova_compute[187157]: 2025-12-03 00:23:52.213 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:52 compute-1 nova_compute[187157]: 2025-12-03 00:23:52.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:54 compute-1 nova_compute[187157]: 2025-12-03 00:23:54.210 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:54 compute-1 nova_compute[187157]: 2025-12-03 00:23:54.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:54 compute-1 nova_compute[187157]: 2025-12-03 00:23:54.954 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:57 compute-1 sshd-session[220727]: Invalid user admin from 185.156.73.233 port 46510
Dec 03 00:23:57 compute-1 sshd-session[220727]: Connection closed by invalid user admin 185.156.73.233 port 46510 [preauth]
Dec 03 00:23:57 compute-1 nova_compute[187157]: 2025-12-03 00:23:57.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:58 compute-1 nova_compute[187157]: 2025-12-03 00:23:58.323 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:23:58 compute-1 nova_compute[187157]: 2025-12-03 00:23:58.323 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:23:58 compute-1 nova_compute[187157]: 2025-12-03 00:23:58.324 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:23:58 compute-1 nova_compute[187157]: 2025-12-03 00:23:58.324 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:23:58 compute-1 nova_compute[187157]: 2025-12-03 00:23:58.443 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:23:58 compute-1 nova_compute[187157]: 2025-12-03 00:23:58.444 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:23:58 compute-1 nova_compute[187157]: 2025-12-03 00:23:58.460 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:23:58 compute-1 nova_compute[187157]: 2025-12-03 00:23:58.460 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5808MB free_disk=73.16100692749023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:23:58 compute-1 nova_compute[187157]: 2025-12-03 00:23:58.461 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:23:58 compute-1 nova_compute[187157]: 2025-12-03 00:23:58.461 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:23:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:23:58.960 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:23:58 compute-1 nova_compute[187157]: 2025-12-03 00:23:58.961 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:23:58.961 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:23:59 compute-1 nova_compute[187157]: 2025-12-03 00:23:59.211 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:59 compute-1 nova_compute[187157]: 2025-12-03 00:23:59.899 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:23:59 compute-1 nova_compute[187157]: 2025-12-03 00:23:59.900 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:23:58 up  1:30,  0 user,  load average: 0.05, 0.12, 0.22\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:23:59 compute-1 nova_compute[187157]: 2025-12-03 00:23:59.952 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing inventories for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 03 00:23:59 compute-1 nova_compute[187157]: 2025-12-03 00:23:59.955 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:00 compute-1 nova_compute[187157]: 2025-12-03 00:24:00.035 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Updating ProviderTree inventory for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 03 00:24:00 compute-1 nova_compute[187157]: 2025-12-03 00:24:00.036 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Updating inventory in ProviderTree for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 03 00:24:00 compute-1 nova_compute[187157]: 2025-12-03 00:24:00.050 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing aggregate associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 03 00:24:00 compute-1 nova_compute[187157]: 2025-12-03 00:24:00.082 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing trait associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ARCH_X86_64,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 03 00:24:00 compute-1 nova_compute[187157]: 2025-12-03 00:24:00.099 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:24:00 compute-1 nova_compute[187157]: 2025-12-03 00:24:00.606 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:24:01 compute-1 nova_compute[187157]: 2025-12-03 00:24:01.115 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:24:01 compute-1 nova_compute[187157]: 2025-12-03 00:24:01.116 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.655s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:01.751 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:01.752 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:01.752 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:01.962 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:02 compute-1 podman[220732]: 2025-12-03 00:24:02.234416646 +0000 UTC m=+0.068012559 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 03 00:24:04 compute-1 nova_compute[187157]: 2025-12-03 00:24:04.213 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:04 compute-1 nova_compute[187157]: 2025-12-03 00:24:04.651 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:04 compute-1 nova_compute[187157]: 2025-12-03 00:24:04.652 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:04 compute-1 nova_compute[187157]: 2025-12-03 00:24:04.991 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:05 compute-1 nova_compute[187157]: 2025-12-03 00:24:05.111 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:24:05 compute-1 nova_compute[187157]: 2025-12-03 00:24:05.111 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:24:05 compute-1 nova_compute[187157]: 2025-12-03 00:24:05.160 187161 DEBUG nova.compute.manager [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:24:05 compute-1 podman[220753]: 2025-12-03 00:24:05.2114264 +0000 UTC m=+0.052772762 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 03 00:24:05 compute-1 podman[197537]: time="2025-12-03T00:24:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:24:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:24:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:24:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:24:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2616 "" "Go-http-client/1.1"
Dec 03 00:24:05 compute-1 nova_compute[187157]: 2025-12-03 00:24:05.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:24:05 compute-1 nova_compute[187157]: 2025-12-03 00:24:05.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:24:05 compute-1 nova_compute[187157]: 2025-12-03 00:24:05.735 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:05 compute-1 nova_compute[187157]: 2025-12-03 00:24:05.736 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:05 compute-1 nova_compute[187157]: 2025-12-03 00:24:05.743 187161 DEBUG nova.virt.hardware [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:24:05 compute-1 nova_compute[187157]: 2025-12-03 00:24:05.743 187161 INFO nova.compute.claims [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Claim successful on node compute-1.ctlplane.example.com
Dec 03 00:24:06 compute-1 nova_compute[187157]: 2025-12-03 00:24:06.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:24:06 compute-1 nova_compute[187157]: 2025-12-03 00:24:06.804 187161 DEBUG nova.compute.provider_tree [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:24:07 compute-1 nova_compute[187157]: 2025-12-03 00:24:07.314 187161 DEBUG nova.scheduler.client.report [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:24:07 compute-1 nova_compute[187157]: 2025-12-03 00:24:07.826 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.091s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:07 compute-1 nova_compute[187157]: 2025-12-03 00:24:07.827 187161 DEBUG nova.compute.manager [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:24:08 compute-1 nova_compute[187157]: 2025-12-03 00:24:08.841 187161 DEBUG nova.compute.manager [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:24:08 compute-1 nova_compute[187157]: 2025-12-03 00:24:08.841 187161 DEBUG nova.network.neutron [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:24:08 compute-1 nova_compute[187157]: 2025-12-03 00:24:08.842 187161 WARNING neutronclient.v2_0.client [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:08 compute-1 nova_compute[187157]: 2025-12-03 00:24:08.842 187161 WARNING neutronclient.v2_0.client [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:09 compute-1 nova_compute[187157]: 2025-12-03 00:24:09.215 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:09 compute-1 nova_compute[187157]: 2025-12-03 00:24:09.442 187161 INFO nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:24:09 compute-1 nova_compute[187157]: 2025-12-03 00:24:09.994 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:10 compute-1 nova_compute[187157]: 2025-12-03 00:24:10.126 187161 DEBUG nova.compute.manager [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:24:10 compute-1 nova_compute[187157]: 2025-12-03 00:24:10.224 187161 DEBUG nova.network.neutron [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Successfully created port: 5bbce4a6-4771-4def-aca4-24359bf62d67 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.145 187161 DEBUG nova.compute.manager [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.146 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.147 187161 INFO nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Creating image(s)
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.147 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "/var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.147 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "/var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.148 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "/var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.148 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.151 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.152 187161 DEBUG oslo_concurrency.processutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.201 187161 DEBUG oslo_concurrency.processutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.203 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.204 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.205 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.212 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.212 187161 DEBUG oslo_concurrency.processutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.265 187161 DEBUG oslo_concurrency.processutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.265 187161 DEBUG oslo_concurrency.processutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.297 187161 DEBUG oslo_concurrency.processutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.299 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.095s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.300 187161 DEBUG oslo_concurrency.processutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.315 187161 DEBUG nova.network.neutron [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Successfully updated port: 5bbce4a6-4771-4def-aca4-24359bf62d67 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.353 187161 DEBUG oslo_concurrency.processutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.354 187161 DEBUG nova.virt.disk.api [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Checking if we can resize image /var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.354 187161 DEBUG oslo_concurrency.processutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.368 187161 DEBUG nova.compute.manager [req-2fa819ef-be8f-4f4f-85ac-a8a508007af9 req-120f2599-2592-4a06-8271-c72e7e07fb3b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Received event network-changed-5bbce4a6-4771-4def-aca4-24359bf62d67 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.369 187161 DEBUG nova.compute.manager [req-2fa819ef-be8f-4f4f-85ac-a8a508007af9 req-120f2599-2592-4a06-8271-c72e7e07fb3b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Refreshing instance network info cache due to event network-changed-5bbce4a6-4771-4def-aca4-24359bf62d67. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.369 187161 DEBUG oslo_concurrency.lockutils [req-2fa819ef-be8f-4f4f-85ac-a8a508007af9 req-120f2599-2592-4a06-8271-c72e7e07fb3b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-bb3af00e-ae3a-4ad6-904d-16f5cb4e923d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.369 187161 DEBUG oslo_concurrency.lockutils [req-2fa819ef-be8f-4f4f-85ac-a8a508007af9 req-120f2599-2592-4a06-8271-c72e7e07fb3b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-bb3af00e-ae3a-4ad6-904d-16f5cb4e923d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.370 187161 DEBUG nova.network.neutron [req-2fa819ef-be8f-4f4f-85ac-a8a508007af9 req-120f2599-2592-4a06-8271-c72e7e07fb3b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Refreshing network info cache for port 5bbce4a6-4771-4def-aca4-24359bf62d67 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.408 187161 DEBUG oslo_concurrency.processutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.408 187161 DEBUG nova.virt.disk.api [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Cannot resize image /var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.409 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.410 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Ensure instance console log exists: /var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.410 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.411 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.411 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.823 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "refresh_cache-bb3af00e-ae3a-4ad6-904d-16f5cb4e923d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:24:11 compute-1 nova_compute[187157]: 2025-12-03 00:24:11.876 187161 WARNING neutronclient.v2_0.client [req-2fa819ef-be8f-4f4f-85ac-a8a508007af9 req-120f2599-2592-4a06-8271-c72e7e07fb3b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:12 compute-1 nova_compute[187157]: 2025-12-03 00:24:12.048 187161 DEBUG nova.network.neutron [req-2fa819ef-be8f-4f4f-85ac-a8a508007af9 req-120f2599-2592-4a06-8271-c72e7e07fb3b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:24:12 compute-1 nova_compute[187157]: 2025-12-03 00:24:12.171 187161 DEBUG nova.network.neutron [req-2fa819ef-be8f-4f4f-85ac-a8a508007af9 req-120f2599-2592-4a06-8271-c72e7e07fb3b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:24:12 compute-1 nova_compute[187157]: 2025-12-03 00:24:12.679 187161 DEBUG oslo_concurrency.lockutils [req-2fa819ef-be8f-4f4f-85ac-a8a508007af9 req-120f2599-2592-4a06-8271-c72e7e07fb3b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-bb3af00e-ae3a-4ad6-904d-16f5cb4e923d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:24:12 compute-1 nova_compute[187157]: 2025-12-03 00:24:12.680 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquired lock "refresh_cache-bb3af00e-ae3a-4ad6-904d-16f5cb4e923d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:24:12 compute-1 nova_compute[187157]: 2025-12-03 00:24:12.680 187161 DEBUG nova.network.neutron [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:24:14 compute-1 nova_compute[187157]: 2025-12-03 00:24:14.089 187161 DEBUG nova.network.neutron [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:24:14 compute-1 nova_compute[187157]: 2025-12-03 00:24:14.217 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:14 compute-1 nova_compute[187157]: 2025-12-03 00:24:14.540 187161 WARNING neutronclient.v2_0.client [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:14 compute-1 nova_compute[187157]: 2025-12-03 00:24:14.691 187161 DEBUG nova.network.neutron [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Updating instance_info_cache with network_info: [{"id": "5bbce4a6-4771-4def-aca4-24359bf62d67", "address": "fa:16:3e:40:29:5f", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bbce4a6-47", "ovs_interfaceid": "5bbce4a6-4771-4def-aca4-24359bf62d67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.028 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:15 compute-1 podman[220788]: 2025-12-03 00:24:15.199529346 +0000 UTC m=+0.046153603 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.326 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Releasing lock "refresh_cache-bb3af00e-ae3a-4ad6-904d-16f5cb4e923d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.326 187161 DEBUG nova.compute.manager [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Instance network_info: |[{"id": "5bbce4a6-4771-4def-aca4-24359bf62d67", "address": "fa:16:3e:40:29:5f", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bbce4a6-47", "ovs_interfaceid": "5bbce4a6-4771-4def-aca4-24359bf62d67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.328 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Start _get_guest_xml network_info=[{"id": "5bbce4a6-4771-4def-aca4-24359bf62d67", "address": "fa:16:3e:40:29:5f", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bbce4a6-47", "ovs_interfaceid": "5bbce4a6-4771-4def-aca4-24359bf62d67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.332 187161 WARNING nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.334 187161 DEBUG nova.virt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-250354013', uuid='bb3af00e-ae3a-4ad6-904d-16f5cb4e923d'), owner=OwnerMeta(userid='43c8524f2d244e8aa3019dd878dcfb81', username='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin', projectid='a8545a5c94f84697a8605fadf08251f7', projectname='tempest-TestExecuteZoneMigrationStrategy-558903593'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "5bbce4a6-4771-4def-aca4-24359bf62d67", "address": "fa:16:3e:40:29:5f", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bbce4a6-47", "ovs_interfaceid": 
"5bbce4a6-4771-4def-aca4-24359bf62d67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764721455.334155) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.338 187161 DEBUG nova.virt.libvirt.host [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.339 187161 DEBUG nova.virt.libvirt.host [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.341 187161 DEBUG nova.virt.libvirt.host [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.343 187161 DEBUG nova.virt.libvirt.host [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.344 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.344 187161 DEBUG nova.virt.hardware [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.345 187161 DEBUG nova.virt.hardware [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.345 187161 DEBUG nova.virt.hardware [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.345 187161 DEBUG nova.virt.hardware [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.345 187161 DEBUG nova.virt.hardware [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.345 187161 DEBUG nova.virt.hardware [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.345 187161 DEBUG nova.virt.hardware [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.346 187161 DEBUG nova.virt.hardware [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.346 187161 DEBUG nova.virt.hardware [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.346 187161 DEBUG nova.virt.hardware [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.346 187161 DEBUG nova.virt.hardware [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.350 187161 DEBUG nova.virt.libvirt.vif [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:24:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-250354013',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-250354013',id=31,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-0x0stoq2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-TestExecut
eZoneMigrationStrategy-558903593-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:24:10Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=bb3af00e-ae3a-4ad6-904d-16f5cb4e923d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5bbce4a6-4771-4def-aca4-24359bf62d67", "address": "fa:16:3e:40:29:5f", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bbce4a6-47", "ovs_interfaceid": "5bbce4a6-4771-4def-aca4-24359bf62d67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.350 187161 DEBUG nova.network.os_vif_util [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converting VIF {"id": "5bbce4a6-4771-4def-aca4-24359bf62d67", "address": "fa:16:3e:40:29:5f", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bbce4a6-47", "ovs_interfaceid": "5bbce4a6-4771-4def-aca4-24359bf62d67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.351 187161 DEBUG nova.network.os_vif_util [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:29:5f,bridge_name='br-int',has_traffic_filtering=True,id=5bbce4a6-4771-4def-aca4-24359bf62d67,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5bbce4a6-47') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:24:15 compute-1 nova_compute[187157]: 2025-12-03 00:24:15.352 187161 DEBUG nova.objects.instance [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid bb3af00e-ae3a-4ad6-904d-16f5cb4e923d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.088 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:24:16 compute-1 nova_compute[187157]:   <uuid>bb3af00e-ae3a-4ad6-904d-16f5cb4e923d</uuid>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   <name>instance-0000001f</name>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   <memory>131072</memory>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   <vcpu>1</vcpu>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   <metadata>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-250354013</nova:name>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-03 00:24:15</nova:creationTime>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:24:16 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 03 00:24:16 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 03 00:24:16 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 03 00:24:16 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:24:16 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:24:16 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 03 00:24:16 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:24:16 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:24:16 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:24:16 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:24:16 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:24:16 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 03 00:24:16 compute-1 nova_compute[187157]:         <nova:properties>
Dec 03 00:24:16 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:24:16 compute-1 nova_compute[187157]:         </nova:properties>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       </nova:image>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <nova:owner>
Dec 03 00:24:16 compute-1 nova_compute[187157]:         <nova:user uuid="43c8524f2d244e8aa3019dd878dcfb81">tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin</nova:user>
Dec 03 00:24:16 compute-1 nova_compute[187157]:         <nova:project uuid="a8545a5c94f84697a8605fadf08251f7">tempest-TestExecuteZoneMigrationStrategy-558903593</nova:project>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       </nova:owner>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <nova:ports>
Dec 03 00:24:16 compute-1 nova_compute[187157]:         <nova:port uuid="5bbce4a6-4771-4def-aca4-24359bf62d67">
Dec 03 00:24:16 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:         </nova:port>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       </nova:ports>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     </nova:instance>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   </metadata>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <system>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <entry name="serial">bb3af00e-ae3a-4ad6-904d-16f5cb4e923d</entry>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <entry name="uuid">bb3af00e-ae3a-4ad6-904d-16f5cb4e923d</entry>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     </system>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   </sysinfo>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   <os>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   </os>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   <features>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <acpi/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <apic/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <vmcoreinfo/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   </features>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   </clock>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact">
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <model>Nehalem</model>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   </cpu>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   <devices>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk.config"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:40:29:5f"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <target dev="tap5bbce4a6-47"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     </interface>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/console.log" append="off"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     </serial>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <video>
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     </video>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     </rng>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <controller type="usb" index="0"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:24:16 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 03 00:24:16 compute-1 nova_compute[187157]:     </memballoon>
Dec 03 00:24:16 compute-1 nova_compute[187157]:   </devices>
Dec 03 00:24:16 compute-1 nova_compute[187157]: </domain>
Dec 03 00:24:16 compute-1 nova_compute[187157]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.091 187161 DEBUG nova.compute.manager [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Preparing to wait for external event network-vif-plugged-5bbce4a6-4771-4def-aca4-24359bf62d67 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.091 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.091 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.091 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.092 187161 DEBUG nova.virt.libvirt.vif [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:24:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-250354013',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-250354013',id=31,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-0x0stoq2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-
TestExecuteZoneMigrationStrategy-558903593-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:24:10Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=bb3af00e-ae3a-4ad6-904d-16f5cb4e923d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5bbce4a6-4771-4def-aca4-24359bf62d67", "address": "fa:16:3e:40:29:5f", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bbce4a6-47", "ovs_interfaceid": "5bbce4a6-4771-4def-aca4-24359bf62d67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.092 187161 DEBUG nova.network.os_vif_util [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converting VIF {"id": "5bbce4a6-4771-4def-aca4-24359bf62d67", "address": "fa:16:3e:40:29:5f", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bbce4a6-47", "ovs_interfaceid": "5bbce4a6-4771-4def-aca4-24359bf62d67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.093 187161 DEBUG nova.network.os_vif_util [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:29:5f,bridge_name='br-int',has_traffic_filtering=True,id=5bbce4a6-4771-4def-aca4-24359bf62d67,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5bbce4a6-47') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.093 187161 DEBUG os_vif [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:29:5f,bridge_name='br-int',has_traffic_filtering=True,id=5bbce4a6-4771-4def-aca4-24359bf62d67,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5bbce4a6-47') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.094 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.094 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.094 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.095 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.095 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '45be203e-75cf-5d02-9468-8ba57126588a', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.126 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.128 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.133 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.133 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5bbce4a6-47, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.134 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap5bbce4a6-47, col_values=(('qos', UUID('c3377e43-c797-4138-b5aa-7c7504063442')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.134 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap5bbce4a6-47, col_values=(('external_ids', {'iface-id': '5bbce4a6-4771-4def-aca4-24359bf62d67', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:40:29:5f', 'vm-uuid': 'bb3af00e-ae3a-4ad6-904d-16f5cb4e923d'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.135 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:16 compute-1 NetworkManager[55553]: <info>  [1764721456.1372] manager: (tap5bbce4a6-47): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.137 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.144 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:16 compute-1 nova_compute[187157]: 2025-12-03 00:24:16.145 187161 INFO os_vif [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:29:5f,bridge_name='br-int',has_traffic_filtering=True,id=5bbce4a6-4771-4def-aca4-24359bf62d67,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5bbce4a6-47')
Dec 03 00:24:17 compute-1 podman[220815]: 2025-12-03 00:24:17.288813234 +0000 UTC m=+0.116703542 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:24:17 compute-1 podman[220814]: 2025-12-03 00:24:17.522631374 +0000 UTC m=+0.362053600 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Dec 03 00:24:19 compute-1 nova_compute[187157]: 2025-12-03 00:24:19.079 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:24:19 compute-1 nova_compute[187157]: 2025-12-03 00:24:19.080 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:24:19 compute-1 nova_compute[187157]: 2025-12-03 00:24:19.081 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] No VIF found with MAC fa:16:3e:40:29:5f, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:24:19 compute-1 nova_compute[187157]: 2025-12-03 00:24:19.081 187161 INFO nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Using config drive
Dec 03 00:24:19 compute-1 openstack_network_exporter[199685]: ERROR   00:24:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:24:19 compute-1 openstack_network_exporter[199685]: ERROR   00:24:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:24:19 compute-1 openstack_network_exporter[199685]: ERROR   00:24:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:24:19 compute-1 openstack_network_exporter[199685]: ERROR   00:24:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:24:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:24:19 compute-1 openstack_network_exporter[199685]: ERROR   00:24:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:24:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:24:19 compute-1 nova_compute[187157]: 2025-12-03 00:24:19.592 187161 WARNING neutronclient.v2_0.client [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.029 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.183 187161 INFO nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Creating config drive at /var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk.config
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.188 187161 DEBUG oslo_concurrency.processutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpwojlvrv0 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.309 187161 DEBUG oslo_concurrency.processutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpwojlvrv0" returned: 0 in 0.121s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:24:20 compute-1 kernel: tap5bbce4a6-47: entered promiscuous mode
Dec 03 00:24:20 compute-1 NetworkManager[55553]: <info>  [1764721460.3596] manager: (tap5bbce4a6-47): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.361 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:20 compute-1 ovn_controller[95464]: 2025-12-03T00:24:20Z|00271|binding|INFO|Claiming lport 5bbce4a6-4771-4def-aca4-24359bf62d67 for this chassis.
Dec 03 00:24:20 compute-1 ovn_controller[95464]: 2025-12-03T00:24:20Z|00272|binding|INFO|5bbce4a6-4771-4def-aca4-24359bf62d67: Claiming fa:16:3e:40:29:5f 10.100.0.8
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.365 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.368 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.378 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:29:5f 10.100.0.8'], port_security=['fa:16:3e:40:29:5f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'bb3af00e-ae3a-4ad6-904d-16f5cb4e923d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8545a5c94f84697a8605fadf08251f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '85b55f5e-0cbc-47d6-baaa-5c5f70692f0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c447000-beb4-4b86-8116-0ff3837374dd, chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=5bbce4a6-4771-4def-aca4-24359bf62d67) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.378 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 5bbce4a6-4771-4def-aca4-24359bf62d67 in datapath f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 bound to our chassis
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.380 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7a76663-52a3-4e8c-af8a-8ef26c8fecf2
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.391 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[799ccfa8-3661-4a72-a2ad-1bec984ef763]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.391 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf7a76663-51 in ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.393 207957 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf7a76663-50 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.393 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[190a319b-2002-4988-b34c-2d1de3925c71]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 systemd-machined[153454]: New machine qemu-25-instance-0000001f.
Dec 03 00:24:20 compute-1 systemd-udevd[220879]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.393 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[20d86c40-e97c-4fe1-923d-fb8bc4d9acd6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 NetworkManager[55553]: <info>  [1764721460.4035] device (tap5bbce4a6-47): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:24:20 compute-1 NetworkManager[55553]: <info>  [1764721460.4045] device (tap5bbce4a6-47): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.404 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[202e00b3-9194-419c-a8c2-8a52610ee50c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.421 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[484936a9-83ad-43f0-9427-b1c8c0725aab]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 systemd[1]: Started Virtual Machine qemu-25-instance-0000001f.
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.458 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:20 compute-1 ovn_controller[95464]: 2025-12-03T00:24:20Z|00273|binding|INFO|Setting lport 5bbce4a6-4771-4def-aca4-24359bf62d67 ovn-installed in OVS
Dec 03 00:24:20 compute-1 ovn_controller[95464]: 2025-12-03T00:24:20Z|00274|binding|INFO|Setting lport 5bbce4a6-4771-4def-aca4-24359bf62d67 up in Southbound
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.462 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.480 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[da150792-a85b-4593-ae8d-e4c9c74764d7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.486 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[464dc2ab-7274-4e76-a78f-0859f47be60e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 NetworkManager[55553]: <info>  [1764721460.4879] manager: (tapf7a76663-50): new Veth device (/org/freedesktop/NetworkManager/Devices/96)
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.520 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[fcfcd13a-a4c5-4263-811a-c04561374cdb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.523 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[b88ac30f-549e-457f-bac3-d5fcd7b196f5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 NetworkManager[55553]: <info>  [1764721460.5473] device (tapf7a76663-50): carrier: link connected
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.552 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[fce3ee68-70af-495f-90d7-519cea0ac032]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.569 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[601069f6-ad72-42f3-b777-16b755e85bf9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7a76663-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:f9:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548020, 'reachable_time': 32018, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220911, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.588 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b218f28e-3a34-4919-831d-45fb4bf98de0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9c:f9e6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 548020, 'tstamp': 548020}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220912, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.605 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e3889923-56ba-4614-99cc-3dafbd7eabe5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7a76663-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:f9:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548020, 'reachable_time': 32018, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220913, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.637 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[315d5628-390d-4919-a295-7f5d49ecf63f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.645 187161 DEBUG nova.compute.manager [req-c06253a3-7f60-4640-bc9e-c79889766cab req-8020d7f1-b12e-4886-8539-d0fa25342690 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Received event network-vif-plugged-5bbce4a6-4771-4def-aca4-24359bf62d67 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.645 187161 DEBUG oslo_concurrency.lockutils [req-c06253a3-7f60-4640-bc9e-c79889766cab req-8020d7f1-b12e-4886-8539-d0fa25342690 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.646 187161 DEBUG oslo_concurrency.lockutils [req-c06253a3-7f60-4640-bc9e-c79889766cab req-8020d7f1-b12e-4886-8539-d0fa25342690 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.646 187161 DEBUG oslo_concurrency.lockutils [req-c06253a3-7f60-4640-bc9e-c79889766cab req-8020d7f1-b12e-4886-8539-d0fa25342690 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.646 187161 DEBUG nova.compute.manager [req-c06253a3-7f60-4640-bc9e-c79889766cab req-8020d7f1-b12e-4886-8539-d0fa25342690 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Processing event network-vif-plugged-5bbce4a6-4771-4def-aca4-24359bf62d67 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.701 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[2b376491-b79c-4f9d-bae2-2802e14455db]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.702 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a76663-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.703 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.703 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7a76663-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.704 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:20 compute-1 kernel: tapf7a76663-50: entered promiscuous mode
Dec 03 00:24:20 compute-1 NetworkManager[55553]: <info>  [1764721460.7053] manager: (tapf7a76663-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.706 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.707 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7a76663-50, col_values=(('external_ids', {'iface-id': '45446e36-d2c9-4ea6-b9fb-83e2711350dd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.708 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:20 compute-1 ovn_controller[95464]: 2025-12-03T00:24:20Z|00275|binding|INFO|Releasing lport 45446e36-d2c9-4ea6-b9fb-83e2711350dd from this chassis (sb_readonly=0)
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.718 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.720 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[3b2e87e5-3fd0-4787-860a-a0792864b797]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.720 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.720 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.720 104348 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.720 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.721 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[55b700a4-35dd-4eb4-a1a9-9b58dedc2311]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.721 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.721 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d161dd7d-e6ad-41d0-93a6-1a31a15773ae]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.722 104348 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: global
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     log         /dev/log local0 debug
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     log-tag     haproxy-metadata-proxy-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     user        root
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     group       root
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     maxconn     1024
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     pidfile     /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     daemon
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: defaults
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     log global
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     mode http
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     option httplog
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     option dontlognull
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     option http-server-close
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     option forwardfor
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     retries                 3
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     timeout http-request    30s
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     timeout connect         30s
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     timeout client          32s
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     timeout server          32s
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     timeout http-keep-alive 30s
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: listen listener
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     bind 169.254.169.254:80
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:     http-request add-header X-OVN-Network-ID f7a76663-52a3-4e8c-af8a-8ef26c8fecf2
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:24:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:20.722 104348 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'env', 'PROCESS_TAG=haproxy-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.724 187161 DEBUG nova.compute.manager [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.729 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.732 187161 INFO nova.virt.libvirt.driver [-] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Instance spawned successfully.
Dec 03 00:24:20 compute-1 nova_compute[187157]: 2025-12-03 00:24:20.733 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:24:21 compute-1 podman[220952]: 2025-12-03 00:24:21.080594026 +0000 UTC m=+0.048537840 container create 6bcd0124ec7d77f18d87937d489a1d9ea3f828e677ecd9f12d0d4807bfcee88b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Dec 03 00:24:21 compute-1 systemd[1]: Started libpod-conmon-6bcd0124ec7d77f18d87937d489a1d9ea3f828e677ecd9f12d0d4807bfcee88b.scope.
Dec 03 00:24:21 compute-1 systemd[1]: Started libcrun container.
Dec 03 00:24:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e7f3d5990a8b8cf4694acb8a882e8b239e9cf12da059dd2d070aab8dffae3edd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:24:21 compute-1 nova_compute[187157]: 2025-12-03 00:24:21.136 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:21 compute-1 podman[220952]: 2025-12-03 00:24:21.149526606 +0000 UTC m=+0.117470440 container init 6bcd0124ec7d77f18d87937d489a1d9ea3f828e677ecd9f12d0d4807bfcee88b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202)
Dec 03 00:24:21 compute-1 podman[220952]: 2025-12-03 00:24:21.056027654 +0000 UTC m=+0.023971488 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:24:21 compute-1 podman[220952]: 2025-12-03 00:24:21.155204622 +0000 UTC m=+0.123148436 container start 6bcd0124ec7d77f18d87937d489a1d9ea3f828e677ecd9f12d0d4807bfcee88b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:24:21 compute-1 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[220967]: [NOTICE]   (220971) : New worker (220973) forked
Dec 03 00:24:21 compute-1 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[220967]: [NOTICE]   (220971) : Loading success.
Dec 03 00:24:21 compute-1 nova_compute[187157]: 2025-12-03 00:24:21.248 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:24:21 compute-1 nova_compute[187157]: 2025-12-03 00:24:21.248 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:24:21 compute-1 nova_compute[187157]: 2025-12-03 00:24:21.248 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:24:21 compute-1 nova_compute[187157]: 2025-12-03 00:24:21.249 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:24:21 compute-1 nova_compute[187157]: 2025-12-03 00:24:21.249 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:24:21 compute-1 nova_compute[187157]: 2025-12-03 00:24:21.249 187161 DEBUG nova.virt.libvirt.driver [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:24:21 compute-1 nova_compute[187157]: 2025-12-03 00:24:21.758 187161 INFO nova.compute.manager [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Took 10.61 seconds to spawn the instance on the hypervisor.
Dec 03 00:24:21 compute-1 nova_compute[187157]: 2025-12-03 00:24:21.758 187161 DEBUG nova.compute.manager [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:24:22 compute-1 nova_compute[187157]: 2025-12-03 00:24:22.332 187161 INFO nova.compute.manager [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Took 16.66 seconds to build instance.
Dec 03 00:24:22 compute-1 nova_compute[187157]: 2025-12-03 00:24:22.715 187161 DEBUG nova.compute.manager [req-979f9979-53f1-4709-8e4b-254a367ce135 req-d079cd13-fe88-4eea-be5a-a5ecd4df4b5c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Received event network-vif-plugged-5bbce4a6-4771-4def-aca4-24359bf62d67 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:24:22 compute-1 nova_compute[187157]: 2025-12-03 00:24:22.716 187161 DEBUG oslo_concurrency.lockutils [req-979f9979-53f1-4709-8e4b-254a367ce135 req-d079cd13-fe88-4eea-be5a-a5ecd4df4b5c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:22 compute-1 nova_compute[187157]: 2025-12-03 00:24:22.716 187161 DEBUG oslo_concurrency.lockutils [req-979f9979-53f1-4709-8e4b-254a367ce135 req-d079cd13-fe88-4eea-be5a-a5ecd4df4b5c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:22 compute-1 nova_compute[187157]: 2025-12-03 00:24:22.716 187161 DEBUG oslo_concurrency.lockutils [req-979f9979-53f1-4709-8e4b-254a367ce135 req-d079cd13-fe88-4eea-be5a-a5ecd4df4b5c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:22 compute-1 nova_compute[187157]: 2025-12-03 00:24:22.716 187161 DEBUG nova.compute.manager [req-979f9979-53f1-4709-8e4b-254a367ce135 req-d079cd13-fe88-4eea-be5a-a5ecd4df4b5c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] No waiting events found dispatching network-vif-plugged-5bbce4a6-4771-4def-aca4-24359bf62d67 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:24:22 compute-1 nova_compute[187157]: 2025-12-03 00:24:22.717 187161 WARNING nova.compute.manager [req-979f9979-53f1-4709-8e4b-254a367ce135 req-d079cd13-fe88-4eea-be5a-a5ecd4df4b5c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Received unexpected event network-vif-plugged-5bbce4a6-4771-4def-aca4-24359bf62d67 for instance with vm_state active and task_state None.
Dec 03 00:24:22 compute-1 nova_compute[187157]: 2025-12-03 00:24:22.839 187161 DEBUG oslo_concurrency.lockutils [None req-0aeafe28-4bad-4ac1-99ce-8fa5357a33c9 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.187s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:25 compute-1 nova_compute[187157]: 2025-12-03 00:24:25.083 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:26 compute-1 nova_compute[187157]: 2025-12-03 00:24:26.174 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:30 compute-1 nova_compute[187157]: 2025-12-03 00:24:30.086 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:31 compute-1 nova_compute[187157]: 2025-12-03 00:24:31.176 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:32 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 03 00:24:32 compute-1 podman[220999]: 2025-12-03 00:24:32.58364815 +0000 UTC m=+0.094287562 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Dec 03 00:24:32 compute-1 ovn_controller[95464]: 2025-12-03T00:24:32Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:40:29:5f 10.100.0.8
Dec 03 00:24:32 compute-1 ovn_controller[95464]: 2025-12-03T00:24:32Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:40:29:5f 10.100.0.8
Dec 03 00:24:34 compute-1 nova_compute[187157]: 2025-12-03 00:24:34.434 187161 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Creating tmpfile /var/lib/nova/instances/tmpve2lxt8_ to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 03 00:24:34 compute-1 nova_compute[187157]: 2025-12-03 00:24:34.435 187161 WARNING neutronclient.v2_0.client [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:34 compute-1 nova_compute[187157]: 2025-12-03 00:24:34.448 187161 DEBUG nova.compute.manager [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpve2lxt8_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 03 00:24:35 compute-1 nova_compute[187157]: 2025-12-03 00:24:35.127 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:35 compute-1 podman[221021]: 2025-12-03 00:24:35.470218536 +0000 UTC m=+0.063685734 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:24:35 compute-1 podman[197537]: time="2025-12-03T00:24:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:24:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:24:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:24:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:24:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3081 "" "Go-http-client/1.1"
Dec 03 00:24:36 compute-1 nova_compute[187157]: 2025-12-03 00:24:36.222 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:36 compute-1 nova_compute[187157]: 2025-12-03 00:24:36.489 187161 WARNING neutronclient.v2_0.client [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:39 compute-1 sshd-session[221041]: Invalid user solv from 193.32.162.146 port 47096
Dec 03 00:24:39 compute-1 sshd-session[221041]: Connection closed by invalid user solv 193.32.162.146 port 47096 [preauth]
Dec 03 00:24:40 compute-1 nova_compute[187157]: 2025-12-03 00:24:40.129 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:41 compute-1 nova_compute[187157]: 2025-12-03 00:24:41.142 187161 DEBUG nova.compute.manager [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpve2lxt8_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='89b22e0d-2f57-40f3-8c02-38af8f0ac9ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 03 00:24:41 compute-1 nova_compute[187157]: 2025-12-03 00:24:41.258 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:42 compute-1 nova_compute[187157]: 2025-12-03 00:24:42.160 187161 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:24:42 compute-1 nova_compute[187157]: 2025-12-03 00:24:42.160 187161 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:24:42 compute-1 nova_compute[187157]: 2025-12-03 00:24:42.161 187161 DEBUG nova.network.neutron [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:24:42 compute-1 nova_compute[187157]: 2025-12-03 00:24:42.670 187161 WARNING neutronclient.v2_0.client [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:43 compute-1 nova_compute[187157]: 2025-12-03 00:24:43.563 187161 WARNING neutronclient.v2_0.client [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:43 compute-1 nova_compute[187157]: 2025-12-03 00:24:43.771 187161 DEBUG nova.network.neutron [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Updating instance_info_cache with network_info: [{"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:24:45 compute-1 nova_compute[187157]: 2025-12-03 00:24:45.171 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:45 compute-1 nova_compute[187157]: 2025-12-03 00:24:45.437 187161 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:24:45 compute-1 nova_compute[187157]: 2025-12-03 00:24:45.453 187161 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpve2lxt8_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='89b22e0d-2f57-40f3-8c02-38af8f0ac9ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 03 00:24:45 compute-1 nova_compute[187157]: 2025-12-03 00:24:45.453 187161 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Creating instance directory: /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 03 00:24:45 compute-1 nova_compute[187157]: 2025-12-03 00:24:45.453 187161 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Creating disk.info with the contents: {'/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk': 'qcow2', '/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 03 00:24:45 compute-1 nova_compute[187157]: 2025-12-03 00:24:45.454 187161 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 03 00:24:45 compute-1 nova_compute[187157]: 2025-12-03 00:24:45.454 187161 DEBUG nova.objects.instance [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:24:45 compute-1 nova_compute[187157]: 2025-12-03 00:24:45.964 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:24:45 compute-1 nova_compute[187157]: 2025-12-03 00:24:45.969 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:24:45 compute-1 nova_compute[187157]: 2025-12-03 00:24:45.971 187161 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.054 187161 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.055 187161 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.055 187161 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.055 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.058 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.058 187161 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.109 187161 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.110 187161 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:24:46 compute-1 podman[221050]: 2025-12-03 00:24:46.242522734 +0000 UTC m=+0.067292351 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.304 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.512 187161 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk 1073741824" returned: 0 in 0.403s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.513 187161 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.458s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.514 187161 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.560 187161 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.561 187161 DEBUG nova.virt.disk.api [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.561 187161 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.614 187161 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.615 187161 DEBUG nova.virt.disk.api [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:24:46 compute-1 nova_compute[187157]: 2025-12-03 00:24:46.616 187161 DEBUG nova.objects.instance [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.169 187161 DEBUG nova.objects.base [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<89b22e0d-2f57-40f3-8c02-38af8f0ac9ab> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.170 187161 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.195 187161 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk.config 497664" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.196 187161 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.198 187161 DEBUG nova.virt.libvirt.vif [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:23:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-802250903',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-802250903',id=30,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:23:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-s8j6lm3n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:24:00Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=89b22e0d-2f57-40f3-8c02-38af8f0ac9ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.199 187161 DEBUG nova.network.os_vif_util [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.199 187161 DEBUG nova.network.os_vif_util [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:0e:e1,bridge_name='br-int',has_traffic_filtering=True,id=d8c14b2b-88f1-46e9-af74-d11479fced60,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8c14b2b-88') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.200 187161 DEBUG os_vif [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:0e:e1,bridge_name='br-int',has_traffic_filtering=True,id=d8c14b2b-88f1-46e9-af74-d11479fced60,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8c14b2b-88') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.201 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.201 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.201 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.202 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.202 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '46e46b79-bfc8-5b87-8e55-d131c9e691f8', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.204 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.205 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.208 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.208 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd8c14b2b-88, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.208 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapd8c14b2b-88, col_values=(('qos', UUID('5ad517b6-e57c-4364-b9b4-efaa15ca138a')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.209 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapd8c14b2b-88, col_values=(('external_ids', {'iface-id': 'd8c14b2b-88f1-46e9-af74-d11479fced60', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ec:0e:e1', 'vm-uuid': '89b22e0d-2f57-40f3-8c02-38af8f0ac9ab'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.210 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:47 compute-1 NetworkManager[55553]: <info>  [1764721487.2114] manager: (tapd8c14b2b-88): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.212 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.217 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.217 187161 INFO os_vif [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:0e:e1,bridge_name='br-int',has_traffic_filtering=True,id=d8c14b2b-88f1-46e9-af74-d11479fced60,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8c14b2b-88')
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.218 187161 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.218 187161 DEBUG nova.compute.manager [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpve2lxt8_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='89b22e0d-2f57-40f3-8c02-38af8f0ac9ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.219 187161 WARNING neutronclient.v2_0.client [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:47 compute-1 nova_compute[187157]: 2025-12-03 00:24:47.659 187161 WARNING neutronclient.v2_0.client [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:48 compute-1 nova_compute[187157]: 2025-12-03 00:24:48.000 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:48.002 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:24:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:48.002 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:24:48 compute-1 podman[221088]: 2025-12-03 00:24:48.231930927 +0000 UTC m=+0.056928832 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Dec 03 00:24:48 compute-1 podman[221089]: 2025-12-03 00:24:48.259475331 +0000 UTC m=+0.086367481 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Dec 03 00:24:48 compute-1 nova_compute[187157]: 2025-12-03 00:24:48.991 187161 DEBUG nova.network.neutron [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Port d8c14b2b-88f1-46e9-af74-d11479fced60 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 03 00:24:49 compute-1 nova_compute[187157]: 2025-12-03 00:24:49.005 187161 DEBUG nova.compute.manager [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpve2lxt8_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='89b22e0d-2f57-40f3-8c02-38af8f0ac9ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 03 00:24:49 compute-1 openstack_network_exporter[199685]: ERROR   00:24:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:24:49 compute-1 openstack_network_exporter[199685]: ERROR   00:24:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:24:49 compute-1 openstack_network_exporter[199685]: ERROR   00:24:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:24:49 compute-1 openstack_network_exporter[199685]: ERROR   00:24:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:24:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:24:49 compute-1 openstack_network_exporter[199685]: ERROR   00:24:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:24:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:24:50 compute-1 nova_compute[187157]: 2025-12-03 00:24:50.173 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:50 compute-1 ovn_controller[95464]: 2025-12-03T00:24:50Z|00276|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 03 00:24:51 compute-1 systemd[1]: Starting libvirt proxy daemon...
Dec 03 00:24:51 compute-1 systemd[1]: Started libvirt proxy daemon.
Dec 03 00:24:51 compute-1 kernel: tapd8c14b2b-88: entered promiscuous mode
Dec 03 00:24:51 compute-1 NetworkManager[55553]: <info>  [1764721491.8592] manager: (tapd8c14b2b-88): new Tun device (/org/freedesktop/NetworkManager/Devices/99)
Dec 03 00:24:51 compute-1 ovn_controller[95464]: 2025-12-03T00:24:51Z|00277|binding|INFO|Claiming lport d8c14b2b-88f1-46e9-af74-d11479fced60 for this additional chassis.
Dec 03 00:24:51 compute-1 ovn_controller[95464]: 2025-12-03T00:24:51Z|00278|binding|INFO|d8c14b2b-88f1-46e9-af74-d11479fced60: Claiming fa:16:3e:ec:0e:e1 10.100.0.3
Dec 03 00:24:51 compute-1 nova_compute[187157]: 2025-12-03 00:24:51.859 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:51.889 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:0e:e1 10.100.0.3'], port_security=['fa:16:3e:ec:0e:e1 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '89b22e0d-2f57-40f3-8c02-38af8f0ac9ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8545a5c94f84697a8605fadf08251f7', 'neutron:revision_number': '10', 'neutron:security_group_ids': '85b55f5e-0cbc-47d6-baaa-5c5f70692f0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c447000-beb4-4b86-8116-0ff3837374dd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=d8c14b2b-88f1-46e9-af74-d11479fced60) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:24:51 compute-1 ovn_controller[95464]: 2025-12-03T00:24:51Z|00279|binding|INFO|Setting lport d8c14b2b-88f1-46e9-af74-d11479fced60 ovn-installed in OVS
Dec 03 00:24:51 compute-1 nova_compute[187157]: 2025-12-03 00:24:51.890 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:51.891 104348 INFO neutron.agent.ovn.metadata.agent [-] Port d8c14b2b-88f1-46e9-af74-d11479fced60 in datapath f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 unbound from our chassis
Dec 03 00:24:51 compute-1 nova_compute[187157]: 2025-12-03 00:24:51.892 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:51.893 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7a76663-52a3-4e8c-af8a-8ef26c8fecf2
Dec 03 00:24:51 compute-1 systemd-udevd[221164]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:24:51 compute-1 nova_compute[187157]: 2025-12-03 00:24:51.897 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:51 compute-1 systemd-machined[153454]: New machine qemu-26-instance-0000001e.
Dec 03 00:24:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:51.915 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[4cf6d172-b71b-4642-a4b0-34b885c47b5b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:51 compute-1 NetworkManager[55553]: <info>  [1764721491.9202] device (tapd8c14b2b-88): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:24:51 compute-1 NetworkManager[55553]: <info>  [1764721491.9215] device (tapd8c14b2b-88): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:24:51 compute-1 systemd[1]: Started Virtual Machine qemu-26-instance-0000001e.
Dec 03 00:24:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:51.953 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[f50cdedd-230a-4f5f-84c1-a3307ac9ff29]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:51 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:51.958 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[3cc2418c-95db-4b6b-93f7-916f221d18c2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:51.998 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[f893ec30-9961-4509-8bf8-b94af2970391]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:52.021 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[3c32aa67-3497-448f-8705-dd970b3aa22e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7a76663-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:f9:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548020, 'reachable_time': 32018, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221181, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:52.044 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ec8f951d-497a-4a14-9a04-8f49456cd4e9]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7a76663-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 548031, 'tstamp': 548031}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221182, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7a76663-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 548034, 'tstamp': 548034}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221182, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:52.046 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a76663-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:52 compute-1 nova_compute[187157]: 2025-12-03 00:24:52.087 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:52 compute-1 nova_compute[187157]: 2025-12-03 00:24:52.088 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:52.089 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7a76663-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:52.089 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:24:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:52.089 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7a76663-50, col_values=(('external_ids', {'iface-id': '45446e36-d2c9-4ea6-b9fb-83e2711350dd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:52.089 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:24:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:52.091 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[14d21b5d-85ab-462b-8202-a7722a83200b]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f7a76663-52a3-4e8c-af8a-8ef26c8fecf2\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:52 compute-1 nova_compute[187157]: 2025-12-03 00:24:52.210 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:52 compute-1 nova_compute[187157]: 2025-12-03 00:24:52.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:24:52 compute-1 nova_compute[187157]: 2025-12-03 00:24:52.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:24:54 compute-1 ovn_controller[95464]: 2025-12-03T00:24:54Z|00280|binding|INFO|Claiming lport d8c14b2b-88f1-46e9-af74-d11479fced60 for this chassis.
Dec 03 00:24:54 compute-1 ovn_controller[95464]: 2025-12-03T00:24:54Z|00281|binding|INFO|d8c14b2b-88f1-46e9-af74-d11479fced60: Claiming fa:16:3e:ec:0e:e1 10.100.0.3
Dec 03 00:24:54 compute-1 ovn_controller[95464]: 2025-12-03T00:24:54Z|00282|binding|INFO|Setting lport d8c14b2b-88f1-46e9-af74-d11479fced60 up in Southbound
Dec 03 00:24:54 compute-1 nova_compute[187157]: 2025-12-03 00:24:54.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:24:55 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:24:55.003 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:55 compute-1 nova_compute[187157]: 2025-12-03 00:24:55.176 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:55 compute-1 nova_compute[187157]: 2025-12-03 00:24:55.669 187161 INFO nova.compute.manager [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Post operation of migration started
Dec 03 00:24:55 compute-1 nova_compute[187157]: 2025-12-03 00:24:55.670 187161 WARNING neutronclient.v2_0.client [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:56 compute-1 nova_compute[187157]: 2025-12-03 00:24:56.073 187161 WARNING neutronclient.v2_0.client [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:56 compute-1 nova_compute[187157]: 2025-12-03 00:24:56.074 187161 WARNING neutronclient.v2_0.client [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:56 compute-1 nova_compute[187157]: 2025-12-03 00:24:56.275 187161 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:24:56 compute-1 nova_compute[187157]: 2025-12-03 00:24:56.276 187161 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:24:56 compute-1 nova_compute[187157]: 2025-12-03 00:24:56.277 187161 DEBUG nova.network.neutron [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:24:56 compute-1 nova_compute[187157]: 2025-12-03 00:24:56.786 187161 WARNING neutronclient.v2_0.client [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:57 compute-1 nova_compute[187157]: 2025-12-03 00:24:57.212 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:57 compute-1 nova_compute[187157]: 2025-12-03 00:24:57.462 187161 WARNING neutronclient.v2_0.client [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:57 compute-1 nova_compute[187157]: 2025-12-03 00:24:57.611 187161 DEBUG nova.network.neutron [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Updating instance_info_cache with network_info: [{"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:24:58 compute-1 nova_compute[187157]: 2025-12-03 00:24:58.117 187161 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:24:58 compute-1 nova_compute[187157]: 2025-12-03 00:24:58.643 187161 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:58 compute-1 nova_compute[187157]: 2025-12-03 00:24:58.644 187161 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:58 compute-1 nova_compute[187157]: 2025-12-03 00:24:58.645 187161 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:58 compute-1 nova_compute[187157]: 2025-12-03 00:24:58.651 187161 INFO nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 03 00:24:58 compute-1 virtqemud[186882]: Domain id=26 name='instance-0000001e' uuid=89b22e0d-2f57-40f3-8c02-38af8f0ac9ab is tainted: custom-monitor
Dec 03 00:24:58 compute-1 nova_compute[187157]: 2025-12-03 00:24:58.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:24:59 compute-1 nova_compute[187157]: 2025-12-03 00:24:59.211 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:59 compute-1 nova_compute[187157]: 2025-12-03 00:24:59.212 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:59 compute-1 nova_compute[187157]: 2025-12-03 00:24:59.212 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:59 compute-1 nova_compute[187157]: 2025-12-03 00:24:59.213 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:24:59 compute-1 nova_compute[187157]: 2025-12-03 00:24:59.658 187161 INFO nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.179 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.274 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.342 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.343 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.401 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.408 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.464 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.465 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.529 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.664 187161 INFO nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.668 187161 DEBUG nova.compute.manager [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.722 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.723 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.740 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.741 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5502MB free_disk=73.1026725769043GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.741 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:00 compute-1 nova_compute[187157]: 2025-12-03 00:25:00.742 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:01 compute-1 nova_compute[187157]: 2025-12-03 00:25:01.178 187161 DEBUG nova.objects.instance [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 03 00:25:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:01.752 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:01.753 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:01.753 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:01 compute-1 nova_compute[187157]: 2025-12-03 00:25:01.762 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Migration for instance 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:25:02 compute-1 nova_compute[187157]: 2025-12-03 00:25:02.198 187161 WARNING neutronclient.v2_0.client [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:25:02 compute-1 nova_compute[187157]: 2025-12-03 00:25:02.213 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:02 compute-1 nova_compute[187157]: 2025-12-03 00:25:02.272 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:25:02 compute-1 nova_compute[187157]: 2025-12-03 00:25:02.295 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance bb3af00e-ae3a-4ad6-904d-16f5cb4e923d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:25:02 compute-1 nova_compute[187157]: 2025-12-03 00:25:02.309 187161 WARNING neutronclient.v2_0.client [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:25:02 compute-1 nova_compute[187157]: 2025-12-03 00:25:02.309 187161 WARNING neutronclient.v2_0.client [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:25:02 compute-1 nova_compute[187157]: 2025-12-03 00:25:02.803 187161 WARNING nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab is not being actively managed by this compute host but has allocations referencing this compute host: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocation because we do not know what to do.
Dec 03 00:25:02 compute-1 nova_compute[187157]: 2025-12-03 00:25:02.804 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:25:02 compute-1 nova_compute[187157]: 2025-12-03 00:25:02.804 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:25:00 up  1:32,  0 user,  load average: 0.35, 0.20, 0.25\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_a8545a5c94f84697a8605fadf08251f7': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:25:02 compute-1 nova_compute[187157]: 2025-12-03 00:25:02.873 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:25:03 compute-1 podman[221216]: 2025-12-03 00:25:03.27701376 +0000 UTC m=+0.102819956 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter)
Dec 03 00:25:03 compute-1 nova_compute[187157]: 2025-12-03 00:25:03.382 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:25:03 compute-1 nova_compute[187157]: 2025-12-03 00:25:03.893 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:25:03 compute-1 nova_compute[187157]: 2025-12-03 00:25:03.894 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.152s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:05 compute-1 nova_compute[187157]: 2025-12-03 00:25:05.182 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:05 compute-1 podman[197537]: time="2025-12-03T00:25:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:25:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:25:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:25:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:25:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3077 "" "Go-http-client/1.1"
Dec 03 00:25:06 compute-1 podman[221238]: 2025-12-03 00:25:06.236359019 +0000 UTC m=+0.068701625 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 03 00:25:06 compute-1 nova_compute[187157]: 2025-12-03 00:25:06.584 187161 DEBUG oslo_concurrency.lockutils [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:06 compute-1 nova_compute[187157]: 2025-12-03 00:25:06.584 187161 DEBUG oslo_concurrency.lockutils [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:06 compute-1 nova_compute[187157]: 2025-12-03 00:25:06.585 187161 DEBUG oslo_concurrency.lockutils [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:06 compute-1 nova_compute[187157]: 2025-12-03 00:25:06.585 187161 DEBUG oslo_concurrency.lockutils [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:06 compute-1 nova_compute[187157]: 2025-12-03 00:25:06.585 187161 DEBUG oslo_concurrency.lockutils [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:06 compute-1 nova_compute[187157]: 2025-12-03 00:25:06.608 187161 INFO nova.compute.manager [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Terminating instance
Dec 03 00:25:06 compute-1 nova_compute[187157]: 2025-12-03 00:25:06.890 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:06 compute-1 nova_compute[187157]: 2025-12-03 00:25:06.891 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:06 compute-1 nova_compute[187157]: 2025-12-03 00:25:06.891 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:06 compute-1 nova_compute[187157]: 2025-12-03 00:25:06.891 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.124 187161 DEBUG nova.compute.manager [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:25:07 compute-1 kernel: tap5bbce4a6-47 (unregistering): left promiscuous mode
Dec 03 00:25:07 compute-1 NetworkManager[55553]: <info>  [1764721507.1585] device (tap5bbce4a6-47): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.173 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:07 compute-1 ovn_controller[95464]: 2025-12-03T00:25:07Z|00283|binding|INFO|Releasing lport 5bbce4a6-4771-4def-aca4-24359bf62d67 from this chassis (sb_readonly=0)
Dec 03 00:25:07 compute-1 ovn_controller[95464]: 2025-12-03T00:25:07Z|00284|binding|INFO|Setting lport 5bbce4a6-4771-4def-aca4-24359bf62d67 down in Southbound
Dec 03 00:25:07 compute-1 ovn_controller[95464]: 2025-12-03T00:25:07Z|00285|binding|INFO|Removing iface tap5bbce4a6-47 ovn-installed in OVS
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.176 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:07.184 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:29:5f 10.100.0.8'], port_security=['fa:16:3e:40:29:5f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'bb3af00e-ae3a-4ad6-904d-16f5cb4e923d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8545a5c94f84697a8605fadf08251f7', 'neutron:revision_number': '5', 'neutron:security_group_ids': '85b55f5e-0cbc-47d6-baaa-5c5f70692f0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c447000-beb4-4b86-8116-0ff3837374dd, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=5bbce4a6-4771-4def-aca4-24359bf62d67) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:25:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:07.185 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 5bbce4a6-4771-4def-aca4-24359bf62d67 in datapath f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 unbound from our chassis
Dec 03 00:25:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:07.186 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7a76663-52a3-4e8c-af8a-8ef26c8fecf2
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.201 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:07.205 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[365bc765-505b-4541-aa0f-cfd77fb8fa35]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.214 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:07 compute-1 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Dec 03 00:25:07 compute-1 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000001f.scope: Consumed 13.258s CPU time.
Dec 03 00:25:07 compute-1 systemd-machined[153454]: Machine qemu-25-instance-0000001f terminated.
Dec 03 00:25:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:07.237 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[021468e1-6a1a-4a57-9bba-8e0881add832]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:07.240 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae313c3-3409-41df-a421-a28870835e94]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:07.264 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[32e2f5b5-34aa-4a8a-b8fe-273a82610f18]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:07.278 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[6a4c2b2e-b386-4eaa-9de6-8c523941c3f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7a76663-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:f9:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548020, 'reachable_time': 32018, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221270, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:07.300 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[15d96753-f172-4c1c-bf3c-2b7ef7753e3c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7a76663-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 548031, 'tstamp': 548031}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221271, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7a76663-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 548034, 'tstamp': 548034}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221271, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:07.301 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a76663-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.308 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.313 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:07.314 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7a76663-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:07.315 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:25:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:07.315 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7a76663-50, col_values=(('external_ids', {'iface-id': '45446e36-d2c9-4ea6-b9fb-83e2711350dd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:07.316 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:25:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:07.317 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[8e7210e5-45d8-49b3-bbea-1446a34b7b6e]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f7a76663-52a3-4e8c-af8a-8ef26c8fecf2\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.356 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.360 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.403 187161 INFO nova.virt.libvirt.driver [-] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Instance destroyed successfully.
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.404 187161 DEBUG nova.objects.instance [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lazy-loading 'resources' on Instance uuid bb3af00e-ae3a-4ad6-904d-16f5cb4e923d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.914 187161 DEBUG nova.virt.libvirt.vif [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-03T00:24:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-250354013',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-250354013',id=31,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:24:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-0x0stoq2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:24:21Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=bb3af00e-ae3a-4ad6-904d-16f5cb4e923d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5bbce4a6-4771-4def-aca4-24359bf62d67", "address": "fa:16:3e:40:29:5f", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bbce4a6-47", "ovs_interfaceid": "5bbce4a6-4771-4def-aca4-24359bf62d67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.915 187161 DEBUG nova.network.os_vif_util [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converting VIF {"id": "5bbce4a6-4771-4def-aca4-24359bf62d67", "address": "fa:16:3e:40:29:5f", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bbce4a6-47", "ovs_interfaceid": "5bbce4a6-4771-4def-aca4-24359bf62d67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.916 187161 DEBUG nova.network.os_vif_util [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:29:5f,bridge_name='br-int',has_traffic_filtering=True,id=5bbce4a6-4771-4def-aca4-24359bf62d67,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5bbce4a6-47') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.917 187161 DEBUG os_vif [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:29:5f,bridge_name='br-int',has_traffic_filtering=True,id=5bbce4a6-4771-4def-aca4-24359bf62d67,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5bbce4a6-47') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.919 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.920 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bbce4a6-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.924 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.925 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.925 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c3377e43-c797-4138-b5aa-7c7504063442) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.926 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.927 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.929 187161 INFO os_vif [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:29:5f,bridge_name='br-int',has_traffic_filtering=True,id=5bbce4a6-4771-4def-aca4-24359bf62d67,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5bbce4a6-47')
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.930 187161 INFO nova.virt.libvirt.driver [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Deleting instance files /var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d_del
Dec 03 00:25:07 compute-1 nova_compute[187157]: 2025-12-03 00:25:07.930 187161 INFO nova.virt.libvirt.driver [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Deletion of /var/lib/nova/instances/bb3af00e-ae3a-4ad6-904d-16f5cb4e923d_del complete
Dec 03 00:25:08 compute-1 nova_compute[187157]: 2025-12-03 00:25:08.157 187161 DEBUG nova.compute.manager [req-e61dc2aa-9ae4-4850-9fba-95ca6f99a5e3 req-325b8b8a-1c00-4851-8ff7-53208fbe966c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Received event network-vif-unplugged-5bbce4a6-4771-4def-aca4-24359bf62d67 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:25:08 compute-1 nova_compute[187157]: 2025-12-03 00:25:08.157 187161 DEBUG oslo_concurrency.lockutils [req-e61dc2aa-9ae4-4850-9fba-95ca6f99a5e3 req-325b8b8a-1c00-4851-8ff7-53208fbe966c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:08 compute-1 nova_compute[187157]: 2025-12-03 00:25:08.158 187161 DEBUG oslo_concurrency.lockutils [req-e61dc2aa-9ae4-4850-9fba-95ca6f99a5e3 req-325b8b8a-1c00-4851-8ff7-53208fbe966c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:08 compute-1 nova_compute[187157]: 2025-12-03 00:25:08.158 187161 DEBUG oslo_concurrency.lockutils [req-e61dc2aa-9ae4-4850-9fba-95ca6f99a5e3 req-325b8b8a-1c00-4851-8ff7-53208fbe966c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:08 compute-1 nova_compute[187157]: 2025-12-03 00:25:08.158 187161 DEBUG nova.compute.manager [req-e61dc2aa-9ae4-4850-9fba-95ca6f99a5e3 req-325b8b8a-1c00-4851-8ff7-53208fbe966c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] No waiting events found dispatching network-vif-unplugged-5bbce4a6-4771-4def-aca4-24359bf62d67 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:25:08 compute-1 nova_compute[187157]: 2025-12-03 00:25:08.158 187161 DEBUG nova.compute.manager [req-e61dc2aa-9ae4-4850-9fba-95ca6f99a5e3 req-325b8b8a-1c00-4851-8ff7-53208fbe966c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Received event network-vif-unplugged-5bbce4a6-4771-4def-aca4-24359bf62d67 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:25:08 compute-1 nova_compute[187157]: 2025-12-03 00:25:08.448 187161 INFO nova.compute.manager [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Took 1.32 seconds to destroy the instance on the hypervisor.
Dec 03 00:25:08 compute-1 nova_compute[187157]: 2025-12-03 00:25:08.449 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:25:08 compute-1 nova_compute[187157]: 2025-12-03 00:25:08.449 187161 DEBUG nova.compute.manager [-] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:25:08 compute-1 nova_compute[187157]: 2025-12-03 00:25:08.449 187161 DEBUG nova.network.neutron [-] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:25:08 compute-1 nova_compute[187157]: 2025-12-03 00:25:08.450 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:25:08 compute-1 nova_compute[187157]: 2025-12-03 00:25:08.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:08 compute-1 nova_compute[187157]: 2025-12-03 00:25:08.799 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:25:09 compute-1 nova_compute[187157]: 2025-12-03 00:25:09.178 187161 DEBUG nova.compute.manager [req-e8da6459-139a-40ab-bc52-558d45dfd1a0 req-abc088e0-2689-458a-8804-df33a0f5e333 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Received event network-vif-deleted-5bbce4a6-4771-4def-aca4-24359bf62d67 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:25:09 compute-1 nova_compute[187157]: 2025-12-03 00:25:09.179 187161 INFO nova.compute.manager [req-e8da6459-139a-40ab-bc52-558d45dfd1a0 req-abc088e0-2689-458a-8804-df33a0f5e333 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Neutron deleted interface 5bbce4a6-4771-4def-aca4-24359bf62d67; detaching it from the instance and deleting it from the info cache
Dec 03 00:25:09 compute-1 nova_compute[187157]: 2025-12-03 00:25:09.179 187161 DEBUG nova.network.neutron [req-e8da6459-139a-40ab-bc52-558d45dfd1a0 req-abc088e0-2689-458a-8804-df33a0f5e333 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:25:09 compute-1 nova_compute[187157]: 2025-12-03 00:25:09.606 187161 DEBUG nova.network.neutron [-] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:25:09 compute-1 nova_compute[187157]: 2025-12-03 00:25:09.685 187161 DEBUG nova.compute.manager [req-e8da6459-139a-40ab-bc52-558d45dfd1a0 req-abc088e0-2689-458a-8804-df33a0f5e333 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Detach interface failed, port_id=5bbce4a6-4771-4def-aca4-24359bf62d67, reason: Instance bb3af00e-ae3a-4ad6-904d-16f5cb4e923d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:25:10 compute-1 nova_compute[187157]: 2025-12-03 00:25:10.121 187161 INFO nova.compute.manager [-] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Took 1.67 seconds to deallocate network for instance.
Dec 03 00:25:10 compute-1 nova_compute[187157]: 2025-12-03 00:25:10.184 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:10 compute-1 nova_compute[187157]: 2025-12-03 00:25:10.234 187161 DEBUG nova.compute.manager [req-ed12235e-b5df-4032-9345-6898d2d52571 req-b79cf0f3-8e13-4e65-b0ae-29032d4e184a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Received event network-vif-unplugged-5bbce4a6-4771-4def-aca4-24359bf62d67 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:25:10 compute-1 nova_compute[187157]: 2025-12-03 00:25:10.235 187161 DEBUG oslo_concurrency.lockutils [req-ed12235e-b5df-4032-9345-6898d2d52571 req-b79cf0f3-8e13-4e65-b0ae-29032d4e184a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:10 compute-1 nova_compute[187157]: 2025-12-03 00:25:10.235 187161 DEBUG oslo_concurrency.lockutils [req-ed12235e-b5df-4032-9345-6898d2d52571 req-b79cf0f3-8e13-4e65-b0ae-29032d4e184a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:10 compute-1 nova_compute[187157]: 2025-12-03 00:25:10.236 187161 DEBUG oslo_concurrency.lockutils [req-ed12235e-b5df-4032-9345-6898d2d52571 req-b79cf0f3-8e13-4e65-b0ae-29032d4e184a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:10 compute-1 nova_compute[187157]: 2025-12-03 00:25:10.236 187161 DEBUG nova.compute.manager [req-ed12235e-b5df-4032-9345-6898d2d52571 req-b79cf0f3-8e13-4e65-b0ae-29032d4e184a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] No waiting events found dispatching network-vif-unplugged-5bbce4a6-4771-4def-aca4-24359bf62d67 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:25:10 compute-1 nova_compute[187157]: 2025-12-03 00:25:10.236 187161 WARNING nova.compute.manager [req-ed12235e-b5df-4032-9345-6898d2d52571 req-b79cf0f3-8e13-4e65-b0ae-29032d4e184a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bb3af00e-ae3a-4ad6-904d-16f5cb4e923d] Received unexpected event network-vif-unplugged-5bbce4a6-4771-4def-aca4-24359bf62d67 for instance with vm_state deleted and task_state None.
Dec 03 00:25:10 compute-1 nova_compute[187157]: 2025-12-03 00:25:10.638 187161 DEBUG oslo_concurrency.lockutils [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:10 compute-1 nova_compute[187157]: 2025-12-03 00:25:10.640 187161 DEBUG oslo_concurrency.lockutils [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:10 compute-1 nova_compute[187157]: 2025-12-03 00:25:10.711 187161 DEBUG nova.compute.provider_tree [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:25:10 compute-1 sshd-session[221289]: Invalid user sol from 45.148.10.240 port 37994
Dec 03 00:25:10 compute-1 sshd-session[221289]: Connection closed by invalid user sol 45.148.10.240 port 37994 [preauth]
Dec 03 00:25:11 compute-1 nova_compute[187157]: 2025-12-03 00:25:11.224 187161 DEBUG nova.scheduler.client.report [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:25:11 compute-1 nova_compute[187157]: 2025-12-03 00:25:11.733 187161 DEBUG oslo_concurrency.lockutils [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:11 compute-1 nova_compute[187157]: 2025-12-03 00:25:11.751 187161 INFO nova.scheduler.client.report [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Deleted allocations for instance bb3af00e-ae3a-4ad6-904d-16f5cb4e923d
Dec 03 00:25:12 compute-1 nova_compute[187157]: 2025-12-03 00:25:12.783 187161 DEBUG oslo_concurrency.lockutils [None req-bda4bb94-5882-46d6-ae60-d2f98fd10ff0 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "bb3af00e-ae3a-4ad6-904d-16f5cb4e923d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.199s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:12 compute-1 nova_compute[187157]: 2025-12-03 00:25:12.927 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:13 compute-1 nova_compute[187157]: 2025-12-03 00:25:13.695 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.109 187161 DEBUG oslo_concurrency.lockutils [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.110 187161 DEBUG oslo_concurrency.lockutils [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.110 187161 DEBUG oslo_concurrency.lockutils [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.110 187161 DEBUG oslo_concurrency.lockutils [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.110 187161 DEBUG oslo_concurrency.lockutils [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.130 187161 INFO nova.compute.manager [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Terminating instance
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.650 187161 DEBUG nova.compute.manager [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:25:14 compute-1 kernel: tapd8c14b2b-88 (unregistering): left promiscuous mode
Dec 03 00:25:14 compute-1 NetworkManager[55553]: <info>  [1764721514.6761] device (tapd8c14b2b-88): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:25:14 compute-1 ovn_controller[95464]: 2025-12-03T00:25:14Z|00286|binding|INFO|Releasing lport d8c14b2b-88f1-46e9-af74-d11479fced60 from this chassis (sb_readonly=0)
Dec 03 00:25:14 compute-1 ovn_controller[95464]: 2025-12-03T00:25:14Z|00287|binding|INFO|Setting lport d8c14b2b-88f1-46e9-af74-d11479fced60 down in Southbound
Dec 03 00:25:14 compute-1 ovn_controller[95464]: 2025-12-03T00:25:14Z|00288|binding|INFO|Removing iface tapd8c14b2b-88 ovn-installed in OVS
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.706 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:14.711 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:0e:e1 10.100.0.3'], port_security=['fa:16:3e:ec:0e:e1 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '89b22e0d-2f57-40f3-8c02-38af8f0ac9ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8545a5c94f84697a8605fadf08251f7', 'neutron:revision_number': '15', 'neutron:security_group_ids': '85b55f5e-0cbc-47d6-baaa-5c5f70692f0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c447000-beb4-4b86-8116-0ff3837374dd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=d8c14b2b-88f1-46e9-af74-d11479fced60) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:25:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:14.712 104348 INFO neutron.agent.ovn.metadata.agent [-] Port d8c14b2b-88f1-46e9-af74-d11479fced60 in datapath f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 unbound from our chassis
Dec 03 00:25:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:14.714 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:25:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:14.715 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[6b8b71b1-8839-4855-9727-e0d3281a791d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:14.715 104348 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 namespace which is not needed anymore
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.722 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:14 compute-1 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Dec 03 00:25:14 compute-1 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000001e.scope: Consumed 2.531s CPU time.
Dec 03 00:25:14 compute-1 systemd-machined[153454]: Machine qemu-26-instance-0000001e terminated.
Dec 03 00:25:14 compute-1 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[220967]: [NOTICE]   (220971) : haproxy version is 3.0.5-8e879a5
Dec 03 00:25:14 compute-1 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[220967]: [NOTICE]   (220971) : path to executable is /usr/sbin/haproxy
Dec 03 00:25:14 compute-1 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[220967]: [WARNING]  (220971) : Exiting Master process...
Dec 03 00:25:14 compute-1 podman[221316]: 2025-12-03 00:25:14.83508516 +0000 UTC m=+0.032184159 container kill 6bcd0124ec7d77f18d87937d489a1d9ea3f828e677ecd9f12d0d4807bfcee88b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:25:14 compute-1 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[220967]: [ALERT]    (220971) : Current worker (220973) exited with code 143 (Terminated)
Dec 03 00:25:14 compute-1 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[220967]: [WARNING]  (220971) : All workers exited. Exiting... (0)
Dec 03 00:25:14 compute-1 systemd[1]: libpod-6bcd0124ec7d77f18d87937d489a1d9ea3f828e677ecd9f12d0d4807bfcee88b.scope: Deactivated successfully.
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.875 187161 DEBUG nova.compute.manager [req-dd5e063d-b24c-4901-a963-3dbcd51b9e09 req-3b2dd2b9-6ee3-4b4a-b4aa-0a097647e5ab 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.875 187161 DEBUG oslo_concurrency.lockutils [req-dd5e063d-b24c-4901-a963-3dbcd51b9e09 req-3b2dd2b9-6ee3-4b4a-b4aa-0a097647e5ab 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.876 187161 DEBUG oslo_concurrency.lockutils [req-dd5e063d-b24c-4901-a963-3dbcd51b9e09 req-3b2dd2b9-6ee3-4b4a-b4aa-0a097647e5ab 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.876 187161 DEBUG oslo_concurrency.lockutils [req-dd5e063d-b24c-4901-a963-3dbcd51b9e09 req-3b2dd2b9-6ee3-4b4a-b4aa-0a097647e5ab 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.876 187161 DEBUG nova.compute.manager [req-dd5e063d-b24c-4901-a963-3dbcd51b9e09 req-3b2dd2b9-6ee3-4b4a-b4aa-0a097647e5ab 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] No waiting events found dispatching network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.876 187161 DEBUG nova.compute.manager [req-dd5e063d-b24c-4901-a963-3dbcd51b9e09 req-3b2dd2b9-6ee3-4b4a-b4aa-0a097647e5ab 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:25:14 compute-1 podman[221333]: 2025-12-03 00:25:14.880988639 +0000 UTC m=+0.025520587 container died 6bcd0124ec7d77f18d87937d489a1d9ea3f828e677ecd9f12d0d4807bfcee88b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.903 187161 INFO nova.virt.libvirt.driver [-] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Instance destroyed successfully.
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.904 187161 DEBUG nova.objects.instance [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lazy-loading 'resources' on Instance uuid 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:25:14 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6bcd0124ec7d77f18d87937d489a1d9ea3f828e677ecd9f12d0d4807bfcee88b-userdata-shm.mount: Deactivated successfully.
Dec 03 00:25:14 compute-1 systemd[1]: var-lib-containers-storage-overlay-e7f3d5990a8b8cf4694acb8a882e8b239e9cf12da059dd2d070aab8dffae3edd-merged.mount: Deactivated successfully.
Dec 03 00:25:14 compute-1 podman[221333]: 2025-12-03 00:25:14.914567941 +0000 UTC m=+0.059099889 container cleanup 6bcd0124ec7d77f18d87937d489a1d9ea3f828e677ecd9f12d0d4807bfcee88b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:25:14 compute-1 systemd[1]: libpod-conmon-6bcd0124ec7d77f18d87937d489a1d9ea3f828e677ecd9f12d0d4807bfcee88b.scope: Deactivated successfully.
Dec 03 00:25:14 compute-1 podman[221334]: 2025-12-03 00:25:14.930613288 +0000 UTC m=+0.074481301 container remove 6bcd0124ec7d77f18d87937d489a1d9ea3f828e677ecd9f12d0d4807bfcee88b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 03 00:25:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:14.935 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[437013cb-00c6-4e75-8cf1-04e6df9d51b4]: (4, ("Wed Dec  3 12:25:14 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 (6bcd0124ec7d77f18d87937d489a1d9ea3f828e677ecd9f12d0d4807bfcee88b)\n6bcd0124ec7d77f18d87937d489a1d9ea3f828e677ecd9f12d0d4807bfcee88b\nWed Dec  3 12:25:14 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 (6bcd0124ec7d77f18d87937d489a1d9ea3f828e677ecd9f12d0d4807bfcee88b)\n6bcd0124ec7d77f18d87937d489a1d9ea3f828e677ecd9f12d0d4807bfcee88b\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:14.936 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[8ee9828a-b5ec-417e-b3fb-a7572bc8ed76]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:14.936 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:25:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:14.936 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[1f04aab2-4074-468a-8a07-54efd75ee448]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:14.937 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a76663-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.979 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:14 compute-1 kernel: tapf7a76663-50: left promiscuous mode
Dec 03 00:25:14 compute-1 nova_compute[187157]: 2025-12-03 00:25:14.993 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:14 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:14.995 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[9488ae28-4e6a-4af4-8ef5-3369d00e91b5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:15 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:15.022 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb9054e-9814-4d02-a250-0c69cb1681d6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:15 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:15.023 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[eea6f982-8bb4-49f3-a87c-39841bdec339]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:15 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:15.039 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[48d495e3-385d-4680-b642-a7f6f97789d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548013, 'reachable_time': 19045, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221382, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:15 compute-1 systemd[1]: run-netns-ovnmeta\x2df7a76663\x2d52a3\x2d4e8c\x2daf8a\x2d8ef26c8fecf2.mount: Deactivated successfully.
Dec 03 00:25:15 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:15.042 104464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:25:15 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:15.043 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[99c4b3a6-1194-4e39-8d2b-897bbdf31e15]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.185 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.412 187161 DEBUG nova.virt.libvirt.vif [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-03T00:23:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-802250903',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-802250903',id=30,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:23:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-s8j6lm3n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',clean_attempts='1',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:25:01Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=89b22e0d-2f57-40f3-8c02-38af8f0ac9ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.412 187161 DEBUG nova.network.os_vif_util [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converting VIF {"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.413 187161 DEBUG nova.network.os_vif_util [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ec:0e:e1,bridge_name='br-int',has_traffic_filtering=True,id=d8c14b2b-88f1-46e9-af74-d11479fced60,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8c14b2b-88') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.414 187161 DEBUG os_vif [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ec:0e:e1,bridge_name='br-int',has_traffic_filtering=True,id=d8c14b2b-88f1-46e9-af74-d11479fced60,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8c14b2b-88') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.415 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.415 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8c14b2b-88, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.416 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.418 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.419 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.420 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=5ad517b6-e57c-4364-b9b4-efaa15ca138a) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.420 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.421 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.423 187161 INFO os_vif [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ec:0e:e1,bridge_name='br-int',has_traffic_filtering=True,id=d8c14b2b-88f1-46e9-af74-d11479fced60,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8c14b2b-88')
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.423 187161 INFO nova.virt.libvirt.driver [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Deleting instance files /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab_del
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.424 187161 INFO nova.virt.libvirt.driver [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Deletion of /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab_del complete
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.937 187161 INFO nova.compute.manager [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Took 1.29 seconds to destroy the instance on the hypervisor.
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.937 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.938 187161 DEBUG nova.compute.manager [-] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.938 187161 DEBUG nova.network.neutron [-] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:25:15 compute-1 nova_compute[187157]: 2025-12-03 00:25:15.939 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:25:16 compute-1 nova_compute[187157]: 2025-12-03 00:25:16.086 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:25:16 compute-1 nova_compute[187157]: 2025-12-03 00:25:16.387 187161 DEBUG nova.compute.manager [req-2e64f916-f9f2-4e68-a13d-3ec4bce8b7d6 req-c5591d37-ca2c-426b-9764-31c282401e47 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-deleted-d8c14b2b-88f1-46e9-af74-d11479fced60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:25:16 compute-1 nova_compute[187157]: 2025-12-03 00:25:16.387 187161 INFO nova.compute.manager [req-2e64f916-f9f2-4e68-a13d-3ec4bce8b7d6 req-c5591d37-ca2c-426b-9764-31c282401e47 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Neutron deleted interface d8c14b2b-88f1-46e9-af74-d11479fced60; detaching it from the instance and deleting it from the info cache
Dec 03 00:25:16 compute-1 nova_compute[187157]: 2025-12-03 00:25:16.388 187161 DEBUG nova.network.neutron [req-2e64f916-f9f2-4e68-a13d-3ec4bce8b7d6 req-c5591d37-ca2c-426b-9764-31c282401e47 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:25:16 compute-1 nova_compute[187157]: 2025-12-03 00:25:16.849 187161 DEBUG nova.network.neutron [-] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:25:16 compute-1 nova_compute[187157]: 2025-12-03 00:25:16.899 187161 DEBUG nova.compute.manager [req-2e64f916-f9f2-4e68-a13d-3ec4bce8b7d6 req-c5591d37-ca2c-426b-9764-31c282401e47 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Detach interface failed, port_id=d8c14b2b-88f1-46e9-af74-d11479fced60, reason: Instance 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:25:17 compute-1 nova_compute[187157]: 2025-12-03 00:25:17.015 187161 DEBUG nova.compute.manager [req-dc3a7ed7-0928-4177-bc13-809e551c4cd4 req-c51cf223-e9d0-470e-8848-17f1b009a6a2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:25:17 compute-1 nova_compute[187157]: 2025-12-03 00:25:17.016 187161 DEBUG oslo_concurrency.lockutils [req-dc3a7ed7-0928-4177-bc13-809e551c4cd4 req-c51cf223-e9d0-470e-8848-17f1b009a6a2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:17 compute-1 nova_compute[187157]: 2025-12-03 00:25:17.016 187161 DEBUG oslo_concurrency.lockutils [req-dc3a7ed7-0928-4177-bc13-809e551c4cd4 req-c51cf223-e9d0-470e-8848-17f1b009a6a2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:17 compute-1 nova_compute[187157]: 2025-12-03 00:25:17.016 187161 DEBUG oslo_concurrency.lockutils [req-dc3a7ed7-0928-4177-bc13-809e551c4cd4 req-c51cf223-e9d0-470e-8848-17f1b009a6a2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:17 compute-1 nova_compute[187157]: 2025-12-03 00:25:17.017 187161 DEBUG nova.compute.manager [req-dc3a7ed7-0928-4177-bc13-809e551c4cd4 req-c51cf223-e9d0-470e-8848-17f1b009a6a2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] No waiting events found dispatching network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:25:17 compute-1 nova_compute[187157]: 2025-12-03 00:25:17.017 187161 DEBUG nova.compute.manager [req-dc3a7ed7-0928-4177-bc13-809e551c4cd4 req-c51cf223-e9d0-470e-8848-17f1b009a6a2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:25:17 compute-1 podman[221383]: 2025-12-03 00:25:17.220937905 +0000 UTC m=+0.053326639 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:25:17 compute-1 nova_compute[187157]: 2025-12-03 00:25:17.460 187161 INFO nova.compute.manager [-] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Took 1.52 seconds to deallocate network for instance.
Dec 03 00:25:18 compute-1 nova_compute[187157]: 2025-12-03 00:25:18.016 187161 DEBUG oslo_concurrency.lockutils [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:18 compute-1 nova_compute[187157]: 2025-12-03 00:25:18.017 187161 DEBUG oslo_concurrency.lockutils [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:18 compute-1 nova_compute[187157]: 2025-12-03 00:25:18.022 187161 DEBUG oslo_concurrency.lockutils [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:18 compute-1 nova_compute[187157]: 2025-12-03 00:25:18.066 187161 INFO nova.scheduler.client.report [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Deleted allocations for instance 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab
Dec 03 00:25:19 compute-1 nova_compute[187157]: 2025-12-03 00:25:19.147 187161 DEBUG oslo_concurrency.lockutils [None req-07d10331-f1a8-4dfb-ba25-36254c8d170d 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.038s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:19 compute-1 podman[221407]: 2025-12-03 00:25:19.230355954 +0000 UTC m=+0.067382089 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 03 00:25:19 compute-1 podman[221408]: 2025-12-03 00:25:19.258559986 +0000 UTC m=+0.098018619 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Dec 03 00:25:19 compute-1 openstack_network_exporter[199685]: ERROR   00:25:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:25:19 compute-1 openstack_network_exporter[199685]: ERROR   00:25:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:25:19 compute-1 openstack_network_exporter[199685]: ERROR   00:25:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:25:19 compute-1 openstack_network_exporter[199685]: ERROR   00:25:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:25:19 compute-1 openstack_network_exporter[199685]: ERROR   00:25:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:25:20 compute-1 nova_compute[187157]: 2025-12-03 00:25:20.187 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:20 compute-1 nova_compute[187157]: 2025-12-03 00:25:20.421 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:25 compute-1 nova_compute[187157]: 2025-12-03 00:25:25.190 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:25 compute-1 nova_compute[187157]: 2025-12-03 00:25:25.422 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:30 compute-1 nova_compute[187157]: 2025-12-03 00:25:30.225 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:30 compute-1 nova_compute[187157]: 2025-12-03 00:25:30.425 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:34 compute-1 podman[221453]: 2025-12-03 00:25:34.218064883 +0000 UTC m=+0.064979132 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Dec 03 00:25:35 compute-1 nova_compute[187157]: 2025-12-03 00:25:35.227 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:35 compute-1 nova_compute[187157]: 2025-12-03 00:25:35.426 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:35 compute-1 podman[197537]: time="2025-12-03T00:25:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:25:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:25:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:25:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:25:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2612 "" "Go-http-client/1.1"
Dec 03 00:25:37 compute-1 podman[221474]: 2025-12-03 00:25:37.207535306 +0000 UTC m=+0.051645700 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:25:40 compute-1 nova_compute[187157]: 2025-12-03 00:25:40.229 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:40 compute-1 nova_compute[187157]: 2025-12-03 00:25:40.427 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:45 compute-1 nova_compute[187157]: 2025-12-03 00:25:45.289 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:45 compute-1 nova_compute[187157]: 2025-12-03 00:25:45.429 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:46 compute-1 nova_compute[187157]: 2025-12-03 00:25:46.969 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "8e46c30c-3390-4271-98a3-af0ca5c223bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:46 compute-1 nova_compute[187157]: 2025-12-03 00:25:46.970 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "8e46c30c-3390-4271-98a3-af0ca5c223bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:47 compute-1 nova_compute[187157]: 2025-12-03 00:25:47.474 187161 DEBUG nova.compute.manager [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:25:48 compute-1 nova_compute[187157]: 2025-12-03 00:25:48.022 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:48 compute-1 nova_compute[187157]: 2025-12-03 00:25:48.023 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:48 compute-1 nova_compute[187157]: 2025-12-03 00:25:48.028 187161 DEBUG nova.virt.hardware [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:25:48 compute-1 nova_compute[187157]: 2025-12-03 00:25:48.028 187161 INFO nova.compute.claims [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Claim successful on node compute-1.ctlplane.example.com
Dec 03 00:25:48 compute-1 podman[221493]: 2025-12-03 00:25:48.201501702 +0000 UTC m=+0.048007031 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:25:49 compute-1 nova_compute[187157]: 2025-12-03 00:25:49.258 187161 DEBUG nova.compute.provider_tree [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:25:49 compute-1 openstack_network_exporter[199685]: ERROR   00:25:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:25:49 compute-1 openstack_network_exporter[199685]: ERROR   00:25:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:25:49 compute-1 openstack_network_exporter[199685]: ERROR   00:25:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:25:49 compute-1 openstack_network_exporter[199685]: ERROR   00:25:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:25:49 compute-1 openstack_network_exporter[199685]: ERROR   00:25:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:25:49 compute-1 nova_compute[187157]: 2025-12-03 00:25:49.782 187161 DEBUG nova.scheduler.client.report [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:25:50 compute-1 podman[221517]: 2025-12-03 00:25:50.208661007 +0000 UTC m=+0.051568298 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:25:50 compute-1 podman[221518]: 2025-12-03 00:25:50.239561733 +0000 UTC m=+0.078876887 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Dec 03 00:25:50 compute-1 nova_compute[187157]: 2025-12-03 00:25:50.292 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.269s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:50 compute-1 nova_compute[187157]: 2025-12-03 00:25:50.292 187161 DEBUG nova.compute.manager [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:25:50 compute-1 nova_compute[187157]: 2025-12-03 00:25:50.295 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:50 compute-1 nova_compute[187157]: 2025-12-03 00:25:50.431 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:50 compute-1 nova_compute[187157]: 2025-12-03 00:25:50.812 187161 DEBUG nova.compute.manager [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:25:50 compute-1 nova_compute[187157]: 2025-12-03 00:25:50.813 187161 DEBUG nova.network.neutron [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:25:50 compute-1 nova_compute[187157]: 2025-12-03 00:25:50.813 187161 WARNING neutronclient.v2_0.client [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:25:50 compute-1 nova_compute[187157]: 2025-12-03 00:25:50.813 187161 WARNING neutronclient.v2_0.client [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:25:51 compute-1 nova_compute[187157]: 2025-12-03 00:25:51.321 187161 INFO nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:25:51 compute-1 nova_compute[187157]: 2025-12-03 00:25:51.828 187161 DEBUG nova.compute.manager [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:25:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:52.117 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:25:52 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:25:52.117 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:25:52 compute-1 nova_compute[187157]: 2025-12-03 00:25:52.117 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:52 compute-1 nova_compute[187157]: 2025-12-03 00:25:52.343 187161 DEBUG nova.network.neutron [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Successfully created port: 9bc34fe0-a425-481e-83c0-c360b51c52bd _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:25:52 compute-1 nova_compute[187157]: 2025-12-03 00:25:52.945 187161 DEBUG nova.compute.manager [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:25:52 compute-1 nova_compute[187157]: 2025-12-03 00:25:52.949 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:25:52 compute-1 nova_compute[187157]: 2025-12-03 00:25:52.949 187161 INFO nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Creating image(s)
Dec 03 00:25:52 compute-1 nova_compute[187157]: 2025-12-03 00:25:52.951 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "/var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:52 compute-1 nova_compute[187157]: 2025-12-03 00:25:52.952 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "/var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:52 compute-1 nova_compute[187157]: 2025-12-03 00:25:52.953 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "/var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:52 compute-1 nova_compute[187157]: 2025-12-03 00:25:52.955 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:25:52 compute-1 nova_compute[187157]: 2025-12-03 00:25:52.963 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:25:52 compute-1 nova_compute[187157]: 2025-12-03 00:25:52.967 187161 DEBUG oslo_concurrency.processutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.059 187161 DEBUG oslo_concurrency.processutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.060 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.061 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.062 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.067 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.067 187161 DEBUG oslo_concurrency.processutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.135 187161 DEBUG oslo_concurrency.processutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.137 187161 DEBUG oslo_concurrency.processutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.173 187161 DEBUG oslo_concurrency.processutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.174 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.174 187161 DEBUG oslo_concurrency.processutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.243 187161 DEBUG oslo_concurrency.processutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.244 187161 DEBUG nova.virt.disk.api [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Checking if we can resize image /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.244 187161 DEBUG oslo_concurrency.processutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.293 187161 DEBUG oslo_concurrency.processutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.294 187161 DEBUG nova.virt.disk.api [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Cannot resize image /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.295 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.295 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Ensure instance console log exists: /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.296 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.297 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.297 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:53 compute-1 nova_compute[187157]: 2025-12-03 00:25:53.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:54 compute-1 nova_compute[187157]: 2025-12-03 00:25:54.250 187161 DEBUG nova.network.neutron [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Successfully updated port: 9bc34fe0-a425-481e-83c0-c360b51c52bd _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:25:54 compute-1 nova_compute[187157]: 2025-12-03 00:25:54.336 187161 DEBUG nova.compute.manager [req-87c61dee-1495-4494-842d-20f48d47d428 req-cf179458-0ef4-4264-acd8-af70d69eaab4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Received event network-changed-9bc34fe0-a425-481e-83c0-c360b51c52bd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:25:54 compute-1 nova_compute[187157]: 2025-12-03 00:25:54.336 187161 DEBUG nova.compute.manager [req-87c61dee-1495-4494-842d-20f48d47d428 req-cf179458-0ef4-4264-acd8-af70d69eaab4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Refreshing instance network info cache due to event network-changed-9bc34fe0-a425-481e-83c0-c360b51c52bd. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:25:54 compute-1 nova_compute[187157]: 2025-12-03 00:25:54.337 187161 DEBUG oslo_concurrency.lockutils [req-87c61dee-1495-4494-842d-20f48d47d428 req-cf179458-0ef4-4264-acd8-af70d69eaab4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-8e46c30c-3390-4271-98a3-af0ca5c223bd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:25:54 compute-1 nova_compute[187157]: 2025-12-03 00:25:54.337 187161 DEBUG oslo_concurrency.lockutils [req-87c61dee-1495-4494-842d-20f48d47d428 req-cf179458-0ef4-4264-acd8-af70d69eaab4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-8e46c30c-3390-4271-98a3-af0ca5c223bd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:25:54 compute-1 nova_compute[187157]: 2025-12-03 00:25:54.338 187161 DEBUG nova.network.neutron [req-87c61dee-1495-4494-842d-20f48d47d428 req-cf179458-0ef4-4264-acd8-af70d69eaab4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Refreshing network info cache for port 9bc34fe0-a425-481e-83c0-c360b51c52bd _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:25:54 compute-1 nova_compute[187157]: 2025-12-03 00:25:54.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:54 compute-1 nova_compute[187157]: 2025-12-03 00:25:54.758 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "refresh_cache-8e46c30c-3390-4271-98a3-af0ca5c223bd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:25:54 compute-1 nova_compute[187157]: 2025-12-03 00:25:54.844 187161 WARNING neutronclient.v2_0.client [req-87c61dee-1495-4494-842d-20f48d47d428 req-cf179458-0ef4-4264-acd8-af70d69eaab4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:25:54 compute-1 nova_compute[187157]: 2025-12-03 00:25:54.929 187161 DEBUG nova.network.neutron [req-87c61dee-1495-4494-842d-20f48d47d428 req-cf179458-0ef4-4264-acd8-af70d69eaab4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:25:55 compute-1 nova_compute[187157]: 2025-12-03 00:25:55.057 187161 DEBUG nova.network.neutron [req-87c61dee-1495-4494-842d-20f48d47d428 req-cf179458-0ef4-4264-acd8-af70d69eaab4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:25:55 compute-1 nova_compute[187157]: 2025-12-03 00:25:55.330 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:55 compute-1 nova_compute[187157]: 2025-12-03 00:25:55.432 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:55 compute-1 nova_compute[187157]: 2025-12-03 00:25:55.563 187161 DEBUG oslo_concurrency.lockutils [req-87c61dee-1495-4494-842d-20f48d47d428 req-cf179458-0ef4-4264-acd8-af70d69eaab4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-8e46c30c-3390-4271-98a3-af0ca5c223bd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:25:55 compute-1 nova_compute[187157]: 2025-12-03 00:25:55.564 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquired lock "refresh_cache-8e46c30c-3390-4271-98a3-af0ca5c223bd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:25:55 compute-1 nova_compute[187157]: 2025-12-03 00:25:55.564 187161 DEBUG nova.network.neutron [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:25:56 compute-1 nova_compute[187157]: 2025-12-03 00:25:56.166 187161 DEBUG nova.network.neutron [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:25:56 compute-1 nova_compute[187157]: 2025-12-03 00:25:56.396 187161 WARNING neutronclient.v2_0.client [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:25:56 compute-1 nova_compute[187157]: 2025-12-03 00:25:56.526 187161 DEBUG nova.network.neutron [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Updating instance_info_cache with network_info: [{"id": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "address": "fa:16:3e:55:51:30", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bc34fe0-a4", "ovs_interfaceid": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:25:56 compute-1 nova_compute[187157]: 2025-12-03 00:25:56.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.035 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Releasing lock "refresh_cache-8e46c30c-3390-4271-98a3-af0ca5c223bd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.036 187161 DEBUG nova.compute.manager [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Instance network_info: |[{"id": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "address": "fa:16:3e:55:51:30", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bc34fe0-a4", "ovs_interfaceid": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.038 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Start _get_guest_xml network_info=[{"id": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "address": "fa:16:3e:55:51:30", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bc34fe0-a4", "ovs_interfaceid": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.042 187161 WARNING nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.044 187161 DEBUG nova.virt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-204412649', uuid='8e46c30c-3390-4271-98a3-af0ca5c223bd'), owner=OwnerMeta(userid='43c8524f2d244e8aa3019dd878dcfb81', username='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin', projectid='a8545a5c94f84697a8605fadf08251f7', projectname='tempest-TestExecuteZoneMigrationStrategy-558903593'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "address": "fa:16:3e:55:51:30", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bc34fe0-a4", "ovs_interfaceid": 
"9bc34fe0-a425-481e-83c0-c360b51c52bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764721557.0442991) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.048 187161 DEBUG nova.virt.libvirt.host [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.049 187161 DEBUG nova.virt.libvirt.host [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.052 187161 DEBUG nova.virt.libvirt.host [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.052 187161 DEBUG nova.virt.libvirt.host [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.053 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.053 187161 DEBUG nova.virt.hardware [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.053 187161 DEBUG nova.virt.hardware [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.054 187161 DEBUG nova.virt.hardware [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.054 187161 DEBUG nova.virt.hardware [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.054 187161 DEBUG nova.virt.hardware [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.054 187161 DEBUG nova.virt.hardware [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.054 187161 DEBUG nova.virt.hardware [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.055 187161 DEBUG nova.virt.hardware [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.055 187161 DEBUG nova.virt.hardware [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.055 187161 DEBUG nova.virt.hardware [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.055 187161 DEBUG nova.virt.hardware [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.058 187161 DEBUG nova.virt.libvirt.vif [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:25:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-204412649',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-204412649',id=33,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-bqgtd0h9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-TestExecut
eZoneMigrationStrategy-558903593-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:25:52Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=8e46c30c-3390-4271-98a3-af0ca5c223bd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "address": "fa:16:3e:55:51:30", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bc34fe0-a4", "ovs_interfaceid": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.059 187161 DEBUG nova.network.os_vif_util [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converting VIF {"id": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "address": "fa:16:3e:55:51:30", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bc34fe0-a4", "ovs_interfaceid": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.059 187161 DEBUG nova.network.os_vif_util [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:51:30,bridge_name='br-int',has_traffic_filtering=True,id=9bc34fe0-a425-481e-83c0-c360b51c52bd,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bc34fe0-a4') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.060 187161 DEBUG nova.objects.instance [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8e46c30c-3390-4271-98a3-af0ca5c223bd obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.586 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:25:57 compute-1 nova_compute[187157]:   <uuid>8e46c30c-3390-4271-98a3-af0ca5c223bd</uuid>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   <name>instance-00000021</name>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   <memory>131072</memory>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   <vcpu>1</vcpu>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   <metadata>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-204412649</nova:name>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-03 00:25:57</nova:creationTime>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:25:57 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 03 00:25:57 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 03 00:25:57 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 03 00:25:57 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:25:57 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:25:57 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 03 00:25:57 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:25:57 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:25:57 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:25:57 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:25:57 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:25:57 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 03 00:25:57 compute-1 nova_compute[187157]:         <nova:properties>
Dec 03 00:25:57 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:25:57 compute-1 nova_compute[187157]:         </nova:properties>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       </nova:image>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <nova:owner>
Dec 03 00:25:57 compute-1 nova_compute[187157]:         <nova:user uuid="43c8524f2d244e8aa3019dd878dcfb81">tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin</nova:user>
Dec 03 00:25:57 compute-1 nova_compute[187157]:         <nova:project uuid="a8545a5c94f84697a8605fadf08251f7">tempest-TestExecuteZoneMigrationStrategy-558903593</nova:project>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       </nova:owner>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <nova:ports>
Dec 03 00:25:57 compute-1 nova_compute[187157]:         <nova:port uuid="9bc34fe0-a425-481e-83c0-c360b51c52bd">
Dec 03 00:25:57 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:         </nova:port>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       </nova:ports>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     </nova:instance>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   </metadata>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <system>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <entry name="serial">8e46c30c-3390-4271-98a3-af0ca5c223bd</entry>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <entry name="uuid">8e46c30c-3390-4271-98a3-af0ca5c223bd</entry>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     </system>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   </sysinfo>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   <os>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   </os>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   <features>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <acpi/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <apic/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <vmcoreinfo/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   </features>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   </clock>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact">
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <model>Nehalem</model>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   </cpu>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   <devices>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk.config"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:55:51:30"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <target dev="tap9bc34fe0-a4"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     </interface>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/console.log" append="off"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     </serial>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <video>
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     </video>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     </rng>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <controller type="usb" index="0"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:25:57 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 03 00:25:57 compute-1 nova_compute[187157]:     </memballoon>
Dec 03 00:25:57 compute-1 nova_compute[187157]:   </devices>
Dec 03 00:25:57 compute-1 nova_compute[187157]: </domain>
Dec 03 00:25:57 compute-1 nova_compute[187157]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.587 187161 DEBUG nova.compute.manager [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Preparing to wait for external event network-vif-plugged-9bc34fe0-a425-481e-83c0-c360b51c52bd prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.588 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.588 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.588 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.589 187161 DEBUG nova.virt.libvirt.vif [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:25:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-204412649',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-204412649',id=33,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-bqgtd0h9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-
TestExecuteZoneMigrationStrategy-558903593-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:25:52Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=8e46c30c-3390-4271-98a3-af0ca5c223bd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "address": "fa:16:3e:55:51:30", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bc34fe0-a4", "ovs_interfaceid": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.590 187161 DEBUG nova.network.os_vif_util [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converting VIF {"id": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "address": "fa:16:3e:55:51:30", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bc34fe0-a4", "ovs_interfaceid": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.591 187161 DEBUG nova.network.os_vif_util [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:51:30,bridge_name='br-int',has_traffic_filtering=True,id=9bc34fe0-a425-481e-83c0-c360b51c52bd,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bc34fe0-a4') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.591 187161 DEBUG os_vif [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:51:30,bridge_name='br-int',has_traffic_filtering=True,id=9bc34fe0-a425-481e-83c0-c360b51c52bd,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bc34fe0-a4') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.592 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.592 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.593 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.594 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.594 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '6e06140b-5d17-52e3-ab69-f407ef2048c9', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.636 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.637 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.639 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.639 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9bc34fe0-a4, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.640 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap9bc34fe0-a4, col_values=(('qos', UUID('2954d92f-57c1-452d-832d-976b78723d12')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.640 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap9bc34fe0-a4, col_values=(('external_ids', {'iface-id': '9bc34fe0-a425-481e-83c0-c360b51c52bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:55:51:30', 'vm-uuid': '8e46c30c-3390-4271-98a3-af0ca5c223bd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.641 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:57 compute-1 NetworkManager[55553]: <info>  [1764721557.6418] manager: (tap9bc34fe0-a4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.642 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.645 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:57 compute-1 nova_compute[187157]: 2025-12-03 00:25:57.646 187161 INFO os_vif [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:51:30,bridge_name='br-int',has_traffic_filtering=True,id=9bc34fe0-a425-481e-83c0-c360b51c52bd,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bc34fe0-a4')
Dec 03 00:25:58 compute-1 nova_compute[187157]: 2025-12-03 00:25:58.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:59 compute-1 nova_compute[187157]: 2025-12-03 00:25:59.203 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:25:59 compute-1 nova_compute[187157]: 2025-12-03 00:25:59.204 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:25:59 compute-1 nova_compute[187157]: 2025-12-03 00:25:59.204 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] No VIF found with MAC fa:16:3e:55:51:30, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:25:59 compute-1 nova_compute[187157]: 2025-12-03 00:25:59.205 187161 INFO nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Using config drive
Dec 03 00:25:59 compute-1 nova_compute[187157]: 2025-12-03 00:25:59.211 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:59 compute-1 nova_compute[187157]: 2025-12-03 00:25:59.212 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:59 compute-1 nova_compute[187157]: 2025-12-03 00:25:59.212 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:59 compute-1 nova_compute[187157]: 2025-12-03 00:25:59.213 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:25:59 compute-1 nova_compute[187157]: 2025-12-03 00:25:59.720 187161 WARNING neutronclient.v2_0.client [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.181 187161 INFO nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Creating config drive at /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk.config
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.192 187161 DEBUG oslo_concurrency.processutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp6na5hsei execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.253 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.320 187161 DEBUG oslo_concurrency.processutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp6na5hsei" returned: 0 in 0.128s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.325 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.326 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.334 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.378 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:26:00 compute-1 kernel: tap9bc34fe0-a4: entered promiscuous mode
Dec 03 00:26:00 compute-1 ovn_controller[95464]: 2025-12-03T00:26:00Z|00289|binding|INFO|Claiming lport 9bc34fe0-a425-481e-83c0-c360b51c52bd for this chassis.
Dec 03 00:26:00 compute-1 ovn_controller[95464]: 2025-12-03T00:26:00Z|00290|binding|INFO|9bc34fe0-a425-481e-83c0-c360b51c52bd: Claiming fa:16:3e:55:51:30 10.100.0.14
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.395 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:00 compute-1 NetworkManager[55553]: <info>  [1764721560.3986] manager: (tap9bc34fe0-a4): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.404 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:51:30 10.100.0.14'], port_security=['fa:16:3e:55:51:30 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '8e46c30c-3390-4271-98a3-af0ca5c223bd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8545a5c94f84697a8605fadf08251f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '85b55f5e-0cbc-47d6-baaa-5c5f70692f0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c447000-beb4-4b86-8116-0ff3837374dd, chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=9bc34fe0-a425-481e-83c0-c360b51c52bd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.405 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 9bc34fe0-a425-481e-83c0-c360b51c52bd in datapath f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 bound to our chassis
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.406 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7a76663-52a3-4e8c-af8a-8ef26c8fecf2
Dec 03 00:26:00 compute-1 ovn_controller[95464]: 2025-12-03T00:26:00Z|00291|binding|INFO|Setting lport 9bc34fe0-a425-481e-83c0-c360b51c52bd ovn-installed in OVS
Dec 03 00:26:00 compute-1 ovn_controller[95464]: 2025-12-03T00:26:00Z|00292|binding|INFO|Setting lport 9bc34fe0-a425-481e-83c0-c360b51c52bd up in Southbound
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.414 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.418 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.424 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ea52c2dc-c5d8-4525-ae2b-145d670b9809]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.425 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf7a76663-51 in ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.428 207957 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf7a76663-50 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.428 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[2b363633-6d98-494f-b91e-27052f310943]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.429 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[9b0f8647-8801-4499-af7f-95be9a3be759]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 systemd-udevd[221604]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.440 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[da123363-fc07-4f68-88b6-50266f663a09]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 NetworkManager[55553]: <info>  [1764721560.4463] device (tap9bc34fe0-a4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:26:00 compute-1 NetworkManager[55553]: <info>  [1764721560.4472] device (tap9bc34fe0-a4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:26:00 compute-1 systemd-machined[153454]: New machine qemu-27-instance-00000021.
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.458 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a680a5d3-a07c-4582-b0fb-6ca5e8199c07]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 systemd[1]: Started Virtual Machine qemu-27-instance-00000021.
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.486 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[c9cc7371-7822-4013-8f54-6734b412740b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.490 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[5d8f1f40-8a41-46ee-b8c2-684452615b36]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 NetworkManager[55553]: <info>  [1764721560.4917] manager: (tapf7a76663-50): new Veth device (/org/freedesktop/NetworkManager/Devices/102)
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.521 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[4fb0fdbd-1114-4f14-8955-fe4c131993e2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.523 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e5b7f4-a64f-40e3-b36b-810ff942adc6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 NetworkManager[55553]: <info>  [1764721560.5470] device (tapf7a76663-50): carrier: link connected
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.553 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[a02bfb77-48cb-496c-9658-d87406ff013a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.571 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[79b6a7ad-51f3-46a5-9df8-174964cdab94]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7a76663-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:f9:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 558020, 'reachable_time': 38335, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221637, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.587 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[be0038f1-44cf-4a66-9b8a-a4ac73394a4b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9c:f9e6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 558020, 'tstamp': 558020}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221638, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.604 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.605 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.607 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[6460a26d-789b-460b-8a9d-c70304708ebd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7a76663-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:f9:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 558020, 'reachable_time': 38335, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221639, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.636 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b082d50c-9d02-4a84-9c66-8db495269ed5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.639 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.639 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5804MB free_disk=73.16073989868164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.640 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.640 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.695 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[fb2d000a-fd1c-4037-a0f3-1aed802b45c7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.697 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a76663-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.697 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.697 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7a76663-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:00 compute-1 NetworkManager[55553]: <info>  [1764721560.6999] manager: (tapf7a76663-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Dec 03 00:26:00 compute-1 kernel: tapf7a76663-50: entered promiscuous mode
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.702 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7a76663-50, col_values=(('external_ids', {'iface-id': '45446e36-d2c9-4ea6-b9fb-83e2711350dd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:00 compute-1 ovn_controller[95464]: 2025-12-03T00:26:00Z|00293|binding|INFO|Releasing lport 45446e36-d2c9-4ea6-b9fb-83e2711350dd from this chassis (sb_readonly=0)
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.705 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[42eb8bee-b710-4d78-a1b1-6e84e8afb6bf]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.706 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.706 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.706 104348 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.706 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.706 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[5475cd53-d4ce-42b6-91ed-08c02bd1241f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.707 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.707 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a46a04c1-6073-49d0-aadc-d3c5abf35224]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.707 104348 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: global
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     log         /dev/log local0 debug
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     log-tag     haproxy-metadata-proxy-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     user        root
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     group       root
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     maxconn     1024
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     pidfile     /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     daemon
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: defaults
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     log global
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     mode http
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     option httplog
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     option dontlognull
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     option http-server-close
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     option forwardfor
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     retries                 3
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     timeout http-request    30s
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     timeout connect         30s
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     timeout client          32s
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     timeout server          32s
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     timeout http-keep-alive 30s
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: listen listener
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     bind 169.254.169.254:80
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:     http-request add-header X-OVN-Network-ID f7a76663-52a3-4e8c-af8a-8ef26c8fecf2
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:26:00 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:00.708 104348 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'env', 'PROCESS_TAG=haproxy-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:26:00 compute-1 nova_compute[187157]: 2025-12-03 00:26:00.699 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:01 compute-1 podman[221676]: 2025-12-03 00:26:01.062492555 +0000 UTC m=+0.048922043 container create c8b976c4d51c14e575d4bdc7a4cd6bc0ef5233e78b9d64e0eb3cf64e29962042 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:26:01 compute-1 systemd[1]: Started libpod-conmon-c8b976c4d51c14e575d4bdc7a4cd6bc0ef5233e78b9d64e0eb3cf64e29962042.scope.
Dec 03 00:26:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:01.119 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:01 compute-1 systemd[1]: Started libcrun container.
Dec 03 00:26:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64fcf8e8c97ba8825e177563d0af97e0ea0a7b8199dbf430b56774574731b73f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:26:01 compute-1 podman[221676]: 2025-12-03 00:26:01.033765831 +0000 UTC m=+0.020195319 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:26:01 compute-1 podman[221676]: 2025-12-03 00:26:01.13674514 +0000 UTC m=+0.123174648 container init c8b976c4d51c14e575d4bdc7a4cd6bc0ef5233e78b9d64e0eb3cf64e29962042 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS)
Dec 03 00:26:01 compute-1 podman[221676]: 2025-12-03 00:26:01.141165347 +0000 UTC m=+0.127594825 container start c8b976c4d51c14e575d4bdc7a4cd6bc0ef5233e78b9d64e0eb3cf64e29962042 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:26:01 compute-1 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[221691]: [NOTICE]   (221695) : New worker (221697) forked
Dec 03 00:26:01 compute-1 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[221691]: [NOTICE]   (221695) : Loading success.
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.248 187161 DEBUG nova.compute.manager [req-f16947ea-70fd-44c6-97dd-e950745f2e2f req-f8081800-284e-43e6-8aa4-254b35ea1a9b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Received event network-vif-plugged-9bc34fe0-a425-481e-83c0-c360b51c52bd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.249 187161 DEBUG oslo_concurrency.lockutils [req-f16947ea-70fd-44c6-97dd-e950745f2e2f req-f8081800-284e-43e6-8aa4-254b35ea1a9b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.249 187161 DEBUG oslo_concurrency.lockutils [req-f16947ea-70fd-44c6-97dd-e950745f2e2f req-f8081800-284e-43e6-8aa4-254b35ea1a9b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.250 187161 DEBUG oslo_concurrency.lockutils [req-f16947ea-70fd-44c6-97dd-e950745f2e2f req-f8081800-284e-43e6-8aa4-254b35ea1a9b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.250 187161 DEBUG nova.compute.manager [req-f16947ea-70fd-44c6-97dd-e950745f2e2f req-f8081800-284e-43e6-8aa4-254b35ea1a9b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Processing event network-vif-plugged-9bc34fe0-a425-481e-83c0-c360b51c52bd _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.251 187161 DEBUG nova.compute.manager [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.256 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.261 187161 INFO nova.virt.libvirt.driver [-] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Instance spawned successfully.
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.262 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.687 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 8e46c30c-3390-4271-98a3-af0ca5c223bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.688 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.688 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:26:00 up  1:33,  0 user,  load average: 0.27, 0.19, 0.24\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_a8545a5c94f84697a8605fadf08251f7': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.715 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:26:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:01.754 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:01.754 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:01.754 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.773 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.774 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.774 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.775 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.775 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:26:01 compute-1 nova_compute[187157]: 2025-12-03 00:26:01.776 187161 DEBUG nova.virt.libvirt.driver [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:26:02 compute-1 nova_compute[187157]: 2025-12-03 00:26:02.221 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:26:02 compute-1 nova_compute[187157]: 2025-12-03 00:26:02.287 187161 INFO nova.compute.manager [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Took 9.34 seconds to spawn the instance on the hypervisor.
Dec 03 00:26:02 compute-1 nova_compute[187157]: 2025-12-03 00:26:02.288 187161 DEBUG nova.compute.manager [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:26:02 compute-1 nova_compute[187157]: 2025-12-03 00:26:02.643 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:02 compute-1 nova_compute[187157]: 2025-12-03 00:26:02.733 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:26:02 compute-1 nova_compute[187157]: 2025-12-03 00:26:02.734 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:02 compute-1 nova_compute[187157]: 2025-12-03 00:26:02.921 187161 INFO nova.compute.manager [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Took 14.94 seconds to build instance.
Dec 03 00:26:03 compute-1 nova_compute[187157]: 2025-12-03 00:26:03.460 187161 DEBUG nova.compute.manager [req-b99a7d21-8654-4118-ab0d-1396f855a348 req-f49d1bf8-0e42-4cdd-9df0-ada3707892fa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Received event network-vif-plugged-9bc34fe0-a425-481e-83c0-c360b51c52bd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:26:03 compute-1 nova_compute[187157]: 2025-12-03 00:26:03.461 187161 DEBUG oslo_concurrency.lockutils [req-b99a7d21-8654-4118-ab0d-1396f855a348 req-f49d1bf8-0e42-4cdd-9df0-ada3707892fa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:03 compute-1 nova_compute[187157]: 2025-12-03 00:26:03.462 187161 DEBUG oslo_concurrency.lockutils [req-b99a7d21-8654-4118-ab0d-1396f855a348 req-f49d1bf8-0e42-4cdd-9df0-ada3707892fa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:03 compute-1 nova_compute[187157]: 2025-12-03 00:26:03.463 187161 DEBUG oslo_concurrency.lockutils [req-b99a7d21-8654-4118-ab0d-1396f855a348 req-f49d1bf8-0e42-4cdd-9df0-ada3707892fa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:03 compute-1 nova_compute[187157]: 2025-12-03 00:26:03.463 187161 DEBUG nova.compute.manager [req-b99a7d21-8654-4118-ab0d-1396f855a348 req-f49d1bf8-0e42-4cdd-9df0-ada3707892fa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] No waiting events found dispatching network-vif-plugged-9bc34fe0-a425-481e-83c0-c360b51c52bd pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:26:03 compute-1 nova_compute[187157]: 2025-12-03 00:26:03.464 187161 WARNING nova.compute.manager [req-b99a7d21-8654-4118-ab0d-1396f855a348 req-f49d1bf8-0e42-4cdd-9df0-ada3707892fa 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Received unexpected event network-vif-plugged-9bc34fe0-a425-481e-83c0-c360b51c52bd for instance with vm_state active and task_state None.
Dec 03 00:26:03 compute-1 nova_compute[187157]: 2025-12-03 00:26:03.488 187161 DEBUG oslo_concurrency.lockutils [None req-ebf76032-2bce-42f5-8a3b-99789c572053 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "8e46c30c-3390-4271-98a3-af0ca5c223bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.518s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:05 compute-1 podman[221707]: 2025-12-03 00:26:05.253853072 +0000 UTC m=+0.078123189 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=ubi9-minimal, vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, managed_by=edpm_ansible, version=9.6, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:26:05 compute-1 nova_compute[187157]: 2025-12-03 00:26:05.333 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:05 compute-1 podman[197537]: time="2025-12-03T00:26:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:26:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:26:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:26:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:26:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3079 "" "Go-http-client/1.1"
Dec 03 00:26:05 compute-1 nova_compute[187157]: 2025-12-03 00:26:05.729 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:26:05 compute-1 nova_compute[187157]: 2025-12-03 00:26:05.730 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:26:05 compute-1 nova_compute[187157]: 2025-12-03 00:26:05.731 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:26:06 compute-1 nova_compute[187157]: 2025-12-03 00:26:06.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:26:07 compute-1 nova_compute[187157]: 2025-12-03 00:26:07.688 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:08 compute-1 podman[221726]: 2025-12-03 00:26:08.211209868 +0000 UTC m=+0.052583621 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 03 00:26:10 compute-1 nova_compute[187157]: 2025-12-03 00:26:10.335 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:10 compute-1 nova_compute[187157]: 2025-12-03 00:26:10.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:26:12 compute-1 nova_compute[187157]: 2025-12-03 00:26:12.693 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:14 compute-1 ovn_controller[95464]: 2025-12-03T00:26:14Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:55:51:30 10.100.0.14
Dec 03 00:26:14 compute-1 ovn_controller[95464]: 2025-12-03T00:26:14Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:55:51:30 10.100.0.14
Dec 03 00:26:15 compute-1 nova_compute[187157]: 2025-12-03 00:26:15.406 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:17 compute-1 nova_compute[187157]: 2025-12-03 00:26:17.701 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:18 compute-1 nova_compute[187157]: 2025-12-03 00:26:18.614 187161 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Creating tmpfile /var/lib/nova/instances/tmpp2pv6ixr to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 03 00:26:18 compute-1 nova_compute[187157]: 2025-12-03 00:26:18.614 187161 WARNING neutronclient.v2_0.client [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:18 compute-1 nova_compute[187157]: 2025-12-03 00:26:18.625 187161 DEBUG nova.compute.manager [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp2pv6ixr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 03 00:26:18 compute-1 podman[221759]: 2025-12-03 00:26:18.694943015 +0000 UTC m=+0.045615484 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:26:19 compute-1 openstack_network_exporter[199685]: ERROR   00:26:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:26:19 compute-1 openstack_network_exporter[199685]: ERROR   00:26:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:26:19 compute-1 openstack_network_exporter[199685]: ERROR   00:26:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:26:19 compute-1 openstack_network_exporter[199685]: ERROR   00:26:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:26:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:26:19 compute-1 openstack_network_exporter[199685]: ERROR   00:26:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:26:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:26:20 compute-1 nova_compute[187157]: 2025-12-03 00:26:20.407 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:20 compute-1 nova_compute[187157]: 2025-12-03 00:26:20.661 187161 WARNING neutronclient.v2_0.client [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:21 compute-1 podman[221785]: 2025-12-03 00:26:21.219237086 +0000 UTC m=+0.058405123 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Dec 03 00:26:21 compute-1 podman[221786]: 2025-12-03 00:26:21.249302062 +0000 UTC m=+0.088401397 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 03 00:26:22 compute-1 nova_compute[187157]: 2025-12-03 00:26:22.724 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:25 compute-1 nova_compute[187157]: 2025-12-03 00:26:25.466 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:26 compute-1 nova_compute[187157]: 2025-12-03 00:26:26.578 187161 DEBUG nova.compute.manager [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp2pv6ixr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='20d06540-44a6-4c4c-ab2f-d4997af86fa0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 03 00:26:27 compute-1 nova_compute[187157]: 2025-12-03 00:26:27.728 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:28 compute-1 nova_compute[187157]: 2025-12-03 00:26:28.845 187161 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-20d06540-44a6-4c4c-ab2f-d4997af86fa0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:26:28 compute-1 nova_compute[187157]: 2025-12-03 00:26:28.846 187161 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-20d06540-44a6-4c4c-ab2f-d4997af86fa0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:26:28 compute-1 nova_compute[187157]: 2025-12-03 00:26:28.846 187161 DEBUG nova.network.neutron [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:26:30 compute-1 ovn_controller[95464]: 2025-12-03T00:26:30Z|00294|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Dec 03 00:26:30 compute-1 nova_compute[187157]: 2025-12-03 00:26:30.469 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:32 compute-1 nova_compute[187157]: 2025-12-03 00:26:32.579 187161 WARNING neutronclient.v2_0.client [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:32 compute-1 nova_compute[187157]: 2025-12-03 00:26:32.731 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:32 compute-1 nova_compute[187157]: 2025-12-03 00:26:32.945 187161 WARNING neutronclient.v2_0.client [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:34 compute-1 nova_compute[187157]: 2025-12-03 00:26:34.138 187161 DEBUG nova.network.neutron [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Updating instance_info_cache with network_info: [{"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:26:34 compute-1 nova_compute[187157]: 2025-12-03 00:26:34.647 187161 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-20d06540-44a6-4c4c-ab2f-d4997af86fa0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:26:34 compute-1 nova_compute[187157]: 2025-12-03 00:26:34.665 187161 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp2pv6ixr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='20d06540-44a6-4c4c-ab2f-d4997af86fa0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 03 00:26:34 compute-1 nova_compute[187157]: 2025-12-03 00:26:34.666 187161 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Creating instance directory: /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 03 00:26:34 compute-1 nova_compute[187157]: 2025-12-03 00:26:34.667 187161 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Creating disk.info with the contents: {'/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk': 'qcow2', '/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 03 00:26:34 compute-1 nova_compute[187157]: 2025-12-03 00:26:34.668 187161 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 03 00:26:34 compute-1 nova_compute[187157]: 2025-12-03 00:26:34.669 187161 DEBUG nova.objects.instance [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 20d06540-44a6-4c4c-ab2f-d4997af86fa0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.177 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.180 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.181 187161 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.237 187161 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.238 187161 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.238 187161 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.239 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.242 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.243 187161 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.326 187161 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.327 187161 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.471 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.479 187161 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk 1073741824" returned: 0 in 0.151s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.480 187161 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.242s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.481 187161 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.536 187161 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.538 187161 DEBUG nova.virt.disk.api [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.538 187161 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.591 187161 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.592 187161 DEBUG nova.virt.disk.api [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:26:35 compute-1 nova_compute[187157]: 2025-12-03 00:26:35.593 187161 DEBUG nova.objects.instance [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 20d06540-44a6-4c4c-ab2f-d4997af86fa0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:26:35 compute-1 podman[197537]: time="2025-12-03T00:26:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:26:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:26:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:26:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:26:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3084 "" "Go-http-client/1.1"
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.107 187161 DEBUG nova.objects.base [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<20d06540-44a6-4c4c-ab2f-d4997af86fa0> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.107 187161 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.143 187161 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk.config 497664" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.144 187161 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.145 187161 DEBUG nova.virt.libvirt.vif [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:25:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1695178384',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1695178384',id=32,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:25:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-sxcn790n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:25:40Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=20d06540-44a6-4c4c-ab2f-d4997af86fa0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.146 187161 DEBUG nova.network.os_vif_util [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.146 187161 DEBUG nova.network.os_vif_util [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:af:d0,bridge_name='br-int',has_traffic_filtering=True,id=697e2ff1-393b-4c81-abc1-b7afc93f0e5b,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap697e2ff1-39') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.147 187161 DEBUG os_vif [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:af:d0,bridge_name='br-int',has_traffic_filtering=True,id=697e2ff1-393b-4c81-abc1-b7afc93f0e5b,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap697e2ff1-39') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.147 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.147 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.148 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.148 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.149 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '8b78b72f-5a3b-542b-b870-cfef03cf2b5f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.150 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.151 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.154 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.154 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap697e2ff1-39, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.154 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap697e2ff1-39, col_values=(('qos', UUID('3211cfe7-779a-44b8-886d-4f1ae3c3a89e')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.154 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap697e2ff1-39, col_values=(('external_ids', {'iface-id': '697e2ff1-393b-4c81-abc1-b7afc93f0e5b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:af:d0', 'vm-uuid': '20d06540-44a6-4c4c-ab2f-d4997af86fa0'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.155 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:36 compute-1 NetworkManager[55553]: <info>  [1764721596.1568] manager: (tap697e2ff1-39): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.158 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.162 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.162 187161 INFO os_vif [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:af:d0,bridge_name='br-int',has_traffic_filtering=True,id=697e2ff1-393b-4c81-abc1-b7afc93f0e5b,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap697e2ff1-39')
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.162 187161 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.163 187161 DEBUG nova.compute.manager [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp2pv6ixr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='20d06540-44a6-4c4c-ab2f-d4997af86fa0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.163 187161 WARNING neutronclient.v2_0.client [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:36 compute-1 podman[221848]: 2025-12-03 00:26:36.202577604 +0000 UTC m=+0.050448050 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.277 187161 WARNING neutronclient.v2_0.client [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:36.570 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:26:36 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:36.571 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:26:36 compute-1 nova_compute[187157]: 2025-12-03 00:26:36.571 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:37 compute-1 nova_compute[187157]: 2025-12-03 00:26:37.439 187161 DEBUG nova.network.neutron [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Port 697e2ff1-393b-4c81-abc1-b7afc93f0e5b updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 03 00:26:37 compute-1 nova_compute[187157]: 2025-12-03 00:26:37.451 187161 DEBUG nova.compute.manager [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp2pv6ixr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='20d06540-44a6-4c4c-ab2f-d4997af86fa0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 03 00:26:39 compute-1 podman[221874]: 2025-12-03 00:26:39.227066062 +0000 UTC m=+0.064495509 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 03 00:26:40 compute-1 nova_compute[187157]: 2025-12-03 00:26:40.473 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:40 compute-1 kernel: tap697e2ff1-39: entered promiscuous mode
Dec 03 00:26:40 compute-1 NetworkManager[55553]: <info>  [1764721600.7356] manager: (tap697e2ff1-39): new Tun device (/org/freedesktop/NetworkManager/Devices/105)
Dec 03 00:26:40 compute-1 nova_compute[187157]: 2025-12-03 00:26:40.736 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:40 compute-1 ovn_controller[95464]: 2025-12-03T00:26:40Z|00295|binding|INFO|Claiming lport 697e2ff1-393b-4c81-abc1-b7afc93f0e5b for this additional chassis.
Dec 03 00:26:40 compute-1 ovn_controller[95464]: 2025-12-03T00:26:40Z|00296|binding|INFO|697e2ff1-393b-4c81-abc1-b7afc93f0e5b: Claiming fa:16:3e:d1:af:d0 10.100.0.11
Dec 03 00:26:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:40.746 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:af:d0 10.100.0.11'], port_security=['fa:16:3e:d1:af:d0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '20d06540-44a6-4c4c-ab2f-d4997af86fa0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8545a5c94f84697a8605fadf08251f7', 'neutron:revision_number': '10', 'neutron:security_group_ids': '85b55f5e-0cbc-47d6-baaa-5c5f70692f0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c447000-beb4-4b86-8116-0ff3837374dd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=697e2ff1-393b-4c81-abc1-b7afc93f0e5b) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:26:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:40.747 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 697e2ff1-393b-4c81-abc1-b7afc93f0e5b in datapath f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 unbound from our chassis
Dec 03 00:26:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:40.749 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7a76663-52a3-4e8c-af8a-8ef26c8fecf2
Dec 03 00:26:40 compute-1 ovn_controller[95464]: 2025-12-03T00:26:40Z|00297|binding|INFO|Setting lport 697e2ff1-393b-4c81-abc1-b7afc93f0e5b ovn-installed in OVS
Dec 03 00:26:40 compute-1 nova_compute[187157]: 2025-12-03 00:26:40.750 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:40 compute-1 nova_compute[187157]: 2025-12-03 00:26:40.754 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:40.772 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[c1ab3b65-9601-4fe5-909f-fe43f9669e13]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:40 compute-1 systemd-machined[153454]: New machine qemu-28-instance-00000020.
Dec 03 00:26:40 compute-1 systemd[1]: Started Virtual Machine qemu-28-instance-00000020.
Dec 03 00:26:40 compute-1 systemd-udevd[221912]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:26:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:40.805 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[1306cd48-32f9-4c1f-beff-6b398513dcb9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:40.808 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[26d3e678-45f2-4fde-b797-3be9fec89edc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:40 compute-1 NetworkManager[55553]: <info>  [1764721600.8124] device (tap697e2ff1-39): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:26:40 compute-1 NetworkManager[55553]: <info>  [1764721600.8138] device (tap697e2ff1-39): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:26:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:40.837 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[de15f38c-2cf1-4e48-a08f-7b63a7657fb5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:40.853 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e03638-b0d1-43db-b039-8e3f7a039e81]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7a76663-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:f9:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 558020, 'reachable_time': 38335, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221921, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:40.867 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[73189d54-8cad-4b80-b30f-92992f8c0302]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7a76663-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 558031, 'tstamp': 558031}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221923, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7a76663-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 558034, 'tstamp': 558034}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221923, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:40.869 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a76663-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:40 compute-1 nova_compute[187157]: 2025-12-03 00:26:40.870 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:40 compute-1 nova_compute[187157]: 2025-12-03 00:26:40.871 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:40.871 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7a76663-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:40.872 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:26:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:40.872 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7a76663-50, col_values=(('external_ids', {'iface-id': '45446e36-d2c9-4ea6-b9fb-83e2711350dd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:40.872 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:26:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:40.874 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f3529b-5c9a-439e-92df-05686b090443]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f7a76663-52a3-4e8c-af8a-8ef26c8fecf2\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:41 compute-1 nova_compute[187157]: 2025-12-03 00:26:41.155 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:45 compute-1 nova_compute[187157]: 2025-12-03 00:26:45.542 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:46 compute-1 nova_compute[187157]: 2025-12-03 00:26:46.158 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:46 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:26:46.573 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:47 compute-1 ovn_controller[95464]: 2025-12-03T00:26:47Z|00298|binding|INFO|Claiming lport 697e2ff1-393b-4c81-abc1-b7afc93f0e5b for this chassis.
Dec 03 00:26:47 compute-1 ovn_controller[95464]: 2025-12-03T00:26:47Z|00299|binding|INFO|697e2ff1-393b-4c81-abc1-b7afc93f0e5b: Claiming fa:16:3e:d1:af:d0 10.100.0.11
Dec 03 00:26:47 compute-1 ovn_controller[95464]: 2025-12-03T00:26:47Z|00300|binding|INFO|Setting lport 697e2ff1-393b-4c81-abc1-b7afc93f0e5b up in Southbound
Dec 03 00:26:49 compute-1 podman[221947]: 2025-12-03 00:26:49.230797517 +0000 UTC m=+0.064371976 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:26:49 compute-1 nova_compute[187157]: 2025-12-03 00:26:49.346 187161 INFO nova.compute.manager [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Post operation of migration started
Dec 03 00:26:49 compute-1 nova_compute[187157]: 2025-12-03 00:26:49.346 187161 WARNING neutronclient.v2_0.client [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:49 compute-1 openstack_network_exporter[199685]: ERROR   00:26:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:26:49 compute-1 openstack_network_exporter[199685]: ERROR   00:26:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:26:49 compute-1 openstack_network_exporter[199685]: ERROR   00:26:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:26:49 compute-1 openstack_network_exporter[199685]: ERROR   00:26:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:26:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:26:49 compute-1 openstack_network_exporter[199685]: ERROR   00:26:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:26:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:26:49 compute-1 nova_compute[187157]: 2025-12-03 00:26:49.454 187161 WARNING neutronclient.v2_0.client [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:49 compute-1 nova_compute[187157]: 2025-12-03 00:26:49.454 187161 WARNING neutronclient.v2_0.client [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:50 compute-1 nova_compute[187157]: 2025-12-03 00:26:50.146 187161 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-20d06540-44a6-4c4c-ab2f-d4997af86fa0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:26:50 compute-1 nova_compute[187157]: 2025-12-03 00:26:50.146 187161 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-20d06540-44a6-4c4c-ab2f-d4997af86fa0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:26:50 compute-1 nova_compute[187157]: 2025-12-03 00:26:50.146 187161 DEBUG nova.network.neutron [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:26:50 compute-1 nova_compute[187157]: 2025-12-03 00:26:50.579 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:50 compute-1 nova_compute[187157]: 2025-12-03 00:26:50.660 187161 WARNING neutronclient.v2_0.client [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:51 compute-1 nova_compute[187157]: 2025-12-03 00:26:51.159 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:51 compute-1 nova_compute[187157]: 2025-12-03 00:26:51.586 187161 WARNING neutronclient.v2_0.client [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:51 compute-1 nova_compute[187157]: 2025-12-03 00:26:51.755 187161 DEBUG nova.network.neutron [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Updating instance_info_cache with network_info: [{"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:26:52 compute-1 podman[221970]: 2025-12-03 00:26:52.207697926 +0000 UTC m=+0.047724145 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Dec 03 00:26:52 compute-1 podman[221971]: 2025-12-03 00:26:52.255610253 +0000 UTC m=+0.089633557 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 03 00:26:52 compute-1 nova_compute[187157]: 2025-12-03 00:26:52.263 187161 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-20d06540-44a6-4c4c-ab2f-d4997af86fa0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:26:52 compute-1 nova_compute[187157]: 2025-12-03 00:26:52.779 187161 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:52 compute-1 nova_compute[187157]: 2025-12-03 00:26:52.780 187161 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:52 compute-1 nova_compute[187157]: 2025-12-03 00:26:52.780 187161 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:52 compute-1 nova_compute[187157]: 2025-12-03 00:26:52.784 187161 INFO nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 03 00:26:52 compute-1 virtqemud[186882]: Domain id=28 name='instance-00000020' uuid=20d06540-44a6-4c4c-ab2f-d4997af86fa0 is tainted: custom-monitor
Dec 03 00:26:53 compute-1 nova_compute[187157]: 2025-12-03 00:26:53.790 187161 INFO nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 03 00:26:54 compute-1 nova_compute[187157]: 2025-12-03 00:26:54.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:26:54 compute-1 nova_compute[187157]: 2025-12-03 00:26:54.797 187161 INFO nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 03 00:26:54 compute-1 nova_compute[187157]: 2025-12-03 00:26:54.803 187161 DEBUG nova.compute.manager [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:26:55 compute-1 nova_compute[187157]: 2025-12-03 00:26:55.314 187161 DEBUG nova.objects.instance [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 03 00:26:55 compute-1 nova_compute[187157]: 2025-12-03 00:26:55.602 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:55 compute-1 nova_compute[187157]: 2025-12-03 00:26:55.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:26:56 compute-1 nova_compute[187157]: 2025-12-03 00:26:56.160 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:56 compute-1 nova_compute[187157]: 2025-12-03 00:26:56.763 187161 WARNING neutronclient.v2_0.client [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:56 compute-1 nova_compute[187157]: 2025-12-03 00:26:56.871 187161 WARNING neutronclient.v2_0.client [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:56 compute-1 nova_compute[187157]: 2025-12-03 00:26:56.871 187161 WARNING neutronclient.v2_0.client [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:57 compute-1 nova_compute[187157]: 2025-12-03 00:26:57.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:26:59 compute-1 nova_compute[187157]: 2025-12-03 00:26:59.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:00 compute-1 nova_compute[187157]: 2025-12-03 00:27:00.214 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:27:00 compute-1 nova_compute[187157]: 2025-12-03 00:27:00.214 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:27:00 compute-1 nova_compute[187157]: 2025-12-03 00:27:00.214 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:00 compute-1 nova_compute[187157]: 2025-12-03 00:27:00.215 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:27:00 compute-1 nova_compute[187157]: 2025-12-03 00:27:00.606 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.161 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.264 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.338 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.338 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.387 187161 DEBUG oslo_concurrency.lockutils [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "8e46c30c-3390-4271-98a3-af0ca5c223bd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.388 187161 DEBUG oslo_concurrency.lockutils [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "8e46c30c-3390-4271-98a3-af0ca5c223bd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.388 187161 DEBUG oslo_concurrency.lockutils [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.389 187161 DEBUG oslo_concurrency.lockutils [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.389 187161 DEBUG oslo_concurrency.lockutils [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.392 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.401 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.411 187161 INFO nova.compute.manager [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Terminating instance
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.455 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.455 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.510 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.649 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.650 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.667 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.668 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5488MB free_disk=73.1032829284668GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.668 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.668 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:27:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:01.755 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:27:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:01.755 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:27:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:01.756 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.941 187161 DEBUG nova.compute.manager [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:27:01 compute-1 kernel: tap9bc34fe0-a4 (unregistering): left promiscuous mode
Dec 03 00:27:01 compute-1 NetworkManager[55553]: <info>  [1764721621.9707] device (tap9bc34fe0-a4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.974 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:01 compute-1 ovn_controller[95464]: 2025-12-03T00:27:01Z|00301|binding|INFO|Releasing lport 9bc34fe0-a425-481e-83c0-c360b51c52bd from this chassis (sb_readonly=0)
Dec 03 00:27:01 compute-1 ovn_controller[95464]: 2025-12-03T00:27:01Z|00302|binding|INFO|Setting lport 9bc34fe0-a425-481e-83c0-c360b51c52bd down in Southbound
Dec 03 00:27:01 compute-1 ovn_controller[95464]: 2025-12-03T00:27:01Z|00303|binding|INFO|Removing iface tap9bc34fe0-a4 ovn-installed in OVS
Dec 03 00:27:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:01.995 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:51:30 10.100.0.14'], port_security=['fa:16:3e:55:51:30 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '8e46c30c-3390-4271-98a3-af0ca5c223bd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8545a5c94f84697a8605fadf08251f7', 'neutron:revision_number': '5', 'neutron:security_group_ids': '85b55f5e-0cbc-47d6-baaa-5c5f70692f0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c447000-beb4-4b86-8116-0ff3837374dd, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=9bc34fe0-a425-481e-83c0-c360b51c52bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:27:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:01.996 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 9bc34fe0-a425-481e-83c0-c360b51c52bd in datapath f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 unbound from our chassis
Dec 03 00:27:01 compute-1 nova_compute[187157]: 2025-12-03 00:27:01.996 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:01.998 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7a76663-52a3-4e8c-af8a-8ef26c8fecf2
Dec 03 00:27:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:02.015 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ac1ca664-9dfe-41ed-bef2-0d3cd05d81db]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:02 compute-1 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000021.scope: Deactivated successfully.
Dec 03 00:27:02 compute-1 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000021.scope: Consumed 14.513s CPU time.
Dec 03 00:27:02 compute-1 systemd-machined[153454]: Machine qemu-27-instance-00000021 terminated.
Dec 03 00:27:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:02.041 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[81bec379-8b29-4e92-8baf-439f3544c56d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:02.046 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[5d269571-1327-44ce-b4f1-8fac40e05d1a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:02.073 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[9138f694-ba78-4e2c-bcef-53f96a997b5a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:02.088 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[2883f6fb-8779-44ee-b387-f18732df30aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7a76663-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:f9:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 558020, 'reachable_time': 38335, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222044, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:02.105 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[94fc1253-97b7-42a3-907b-d87284771dfe]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7a76663-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 558031, 'tstamp': 558031}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222045, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7a76663-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 558034, 'tstamp': 558034}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222045, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:02.105 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a76663-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:27:02 compute-1 nova_compute[187157]: 2025-12-03 00:27:02.107 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:02 compute-1 nova_compute[187157]: 2025-12-03 00:27:02.111 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:02.112 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7a76663-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:27:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:02.112 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:27:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:02.112 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7a76663-50, col_values=(('external_ids', {'iface-id': '45446e36-d2c9-4ea6-b9fb-83e2711350dd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:27:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:02.112 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:27:02 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:02.114 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ce3f1faa-13bb-4740-94cf-247929807d8d]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f7a76663-52a3-4e8c-af8a-8ef26c8fecf2\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:02 compute-1 nova_compute[187157]: 2025-12-03 00:27:02.204 187161 INFO nova.virt.libvirt.driver [-] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Instance destroyed successfully.
Dec 03 00:27:02 compute-1 nova_compute[187157]: 2025-12-03 00:27:02.204 187161 DEBUG nova.objects.instance [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lazy-loading 'resources' on Instance uuid 8e46c30c-3390-4271-98a3-af0ca5c223bd obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:27:02 compute-1 nova_compute[187157]: 2025-12-03 00:27:02.710 187161 DEBUG nova.compute.manager [req-e168fd55-4de0-48c3-b75d-80ac764ce29e req-e5bca3b1-8500-478c-985c-ce72b0abe565 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Received event network-vif-unplugged-9bc34fe0-a425-481e-83c0-c360b51c52bd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:27:02 compute-1 nova_compute[187157]: 2025-12-03 00:27:02.710 187161 DEBUG oslo_concurrency.lockutils [req-e168fd55-4de0-48c3-b75d-80ac764ce29e req-e5bca3b1-8500-478c-985c-ce72b0abe565 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:27:02 compute-1 nova_compute[187157]: 2025-12-03 00:27:02.710 187161 DEBUG oslo_concurrency.lockutils [req-e168fd55-4de0-48c3-b75d-80ac764ce29e req-e5bca3b1-8500-478c-985c-ce72b0abe565 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:27:02 compute-1 nova_compute[187157]: 2025-12-03 00:27:02.711 187161 DEBUG oslo_concurrency.lockutils [req-e168fd55-4de0-48c3-b75d-80ac764ce29e req-e5bca3b1-8500-478c-985c-ce72b0abe565 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:02 compute-1 nova_compute[187157]: 2025-12-03 00:27:02.711 187161 DEBUG nova.compute.manager [req-e168fd55-4de0-48c3-b75d-80ac764ce29e req-e5bca3b1-8500-478c-985c-ce72b0abe565 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] No waiting events found dispatching network-vif-unplugged-9bc34fe0-a425-481e-83c0-c360b51c52bd pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:27:02 compute-1 nova_compute[187157]: 2025-12-03 00:27:02.711 187161 DEBUG nova.compute.manager [req-e168fd55-4de0-48c3-b75d-80ac764ce29e req-e5bca3b1-8500-478c-985c-ce72b0abe565 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Received event network-vif-unplugged-9bc34fe0-a425-481e-83c0-c360b51c52bd for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:27:03 compute-1 nova_compute[187157]: 2025-12-03 00:27:03.667 187161 DEBUG nova.virt.libvirt.vif [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-03T00:25:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-204412649',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-204412649',id=33,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:26:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-bqgtd0h9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:26:02Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=8e46c30c-3390-4271-98a3-af0ca5c223bd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "address": "fa:16:3e:55:51:30", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bc34fe0-a4", "ovs_interfaceid": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:27:03 compute-1 nova_compute[187157]: 2025-12-03 00:27:03.668 187161 DEBUG nova.network.os_vif_util [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converting VIF {"id": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "address": "fa:16:3e:55:51:30", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bc34fe0-a4", "ovs_interfaceid": "9bc34fe0-a425-481e-83c0-c360b51c52bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:27:03 compute-1 nova_compute[187157]: 2025-12-03 00:27:03.668 187161 DEBUG nova.network.os_vif_util [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:51:30,bridge_name='br-int',has_traffic_filtering=True,id=9bc34fe0-a425-481e-83c0-c360b51c52bd,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bc34fe0-a4') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:27:03 compute-1 nova_compute[187157]: 2025-12-03 00:27:03.669 187161 DEBUG os_vif [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:51:30,bridge_name='br-int',has_traffic_filtering=True,id=9bc34fe0-a425-481e-83c0-c360b51c52bd,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bc34fe0-a4') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:27:03 compute-1 nova_compute[187157]: 2025-12-03 00:27:03.672 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:03 compute-1 nova_compute[187157]: 2025-12-03 00:27:03.672 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bc34fe0-a4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:27:03 compute-1 nova_compute[187157]: 2025-12-03 00:27:03.673 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:03 compute-1 nova_compute[187157]: 2025-12-03 00:27:03.675 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:27:03 compute-1 nova_compute[187157]: 2025-12-03 00:27:03.675 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:03 compute-1 nova_compute[187157]: 2025-12-03 00:27:03.676 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:03 compute-1 nova_compute[187157]: 2025-12-03 00:27:03.676 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=2954d92f-57c1-452d-832d-976b78723d12) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:27:03 compute-1 nova_compute[187157]: 2025-12-03 00:27:03.677 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:03 compute-1 nova_compute[187157]: 2025-12-03 00:27:03.677 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:03 compute-1 nova_compute[187157]: 2025-12-03 00:27:03.679 187161 INFO os_vif [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:51:30,bridge_name='br-int',has_traffic_filtering=True,id=9bc34fe0-a425-481e-83c0-c360b51c52bd,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bc34fe0-a4')
Dec 03 00:27:03 compute-1 nova_compute[187157]: 2025-12-03 00:27:03.680 187161 INFO nova.virt.libvirt.driver [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Deleting instance files /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd_del
Dec 03 00:27:03 compute-1 nova_compute[187157]: 2025-12-03 00:27:03.680 187161 INFO nova.virt.libvirt.driver [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Deletion of /var/lib/nova/instances/8e46c30c-3390-4271-98a3-af0ca5c223bd_del complete
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.191 187161 INFO nova.compute.manager [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Took 2.25 seconds to destroy the instance on the hypervisor.
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.191 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.191 187161 DEBUG nova.compute.manager [-] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.191 187161 DEBUG nova.network.neutron [-] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.192 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.215 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 8e46c30c-3390-4271-98a3-af0ca5c223bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.215 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 20d06540-44a6-4c4c-ab2f-d4997af86fa0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.215 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.216 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:27:01 up  1:34,  0 user,  load average: 0.53, 0.29, 0.27\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_a8545a5c94f84697a8605fadf08251f7': '2', 'io_workload': '0', 'num_task_deleting': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.340 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.344 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.768 187161 DEBUG nova.compute.manager [req-45bd4970-a10d-4be1-baf4-cb627b9d26bc req-4654b31c-8a31-4a01-9667-5792a3974e15 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Received event network-vif-unplugged-9bc34fe0-a425-481e-83c0-c360b51c52bd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.768 187161 DEBUG oslo_concurrency.lockutils [req-45bd4970-a10d-4be1-baf4-cb627b9d26bc req-4654b31c-8a31-4a01-9667-5792a3974e15 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.768 187161 DEBUG oslo_concurrency.lockutils [req-45bd4970-a10d-4be1-baf4-cb627b9d26bc req-4654b31c-8a31-4a01-9667-5792a3974e15 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.769 187161 DEBUG oslo_concurrency.lockutils [req-45bd4970-a10d-4be1-baf4-cb627b9d26bc req-4654b31c-8a31-4a01-9667-5792a3974e15 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "8e46c30c-3390-4271-98a3-af0ca5c223bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.769 187161 DEBUG nova.compute.manager [req-45bd4970-a10d-4be1-baf4-cb627b9d26bc req-4654b31c-8a31-4a01-9667-5792a3974e15 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] No waiting events found dispatching network-vif-unplugged-9bc34fe0-a425-481e-83c0-c360b51c52bd pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.769 187161 DEBUG nova.compute.manager [req-45bd4970-a10d-4be1-baf4-cb627b9d26bc req-4654b31c-8a31-4a01-9667-5792a3974e15 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Received event network-vif-unplugged-9bc34fe0-a425-481e-83c0-c360b51c52bd for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.769 187161 DEBUG nova.compute.manager [req-45bd4970-a10d-4be1-baf4-cb627b9d26bc req-4654b31c-8a31-4a01-9667-5792a3974e15 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Received event network-vif-deleted-9bc34fe0-a425-481e-83c0-c360b51c52bd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.770 187161 INFO nova.compute.manager [req-45bd4970-a10d-4be1-baf4-cb627b9d26bc req-4654b31c-8a31-4a01-9667-5792a3974e15 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Neutron deleted interface 9bc34fe0-a425-481e-83c0-c360b51c52bd; detaching it from the instance and deleting it from the info cache
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.770 187161 DEBUG nova.network.neutron [req-45bd4970-a10d-4be1-baf4-cb627b9d26bc req-4654b31c-8a31-4a01-9667-5792a3974e15 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:27:04 compute-1 nova_compute[187157]: 2025-12-03 00:27:04.849 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:27:05 compute-1 nova_compute[187157]: 2025-12-03 00:27:05.167 187161 DEBUG nova.network.neutron [-] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:27:05 compute-1 nova_compute[187157]: 2025-12-03 00:27:05.279 187161 DEBUG nova.compute.manager [req-45bd4970-a10d-4be1-baf4-cb627b9d26bc req-4654b31c-8a31-4a01-9667-5792a3974e15 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Detach interface failed, port_id=9bc34fe0-a425-481e-83c0-c360b51c52bd, reason: Instance 8e46c30c-3390-4271-98a3-af0ca5c223bd could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:27:05 compute-1 nova_compute[187157]: 2025-12-03 00:27:05.360 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:27:05 compute-1 nova_compute[187157]: 2025-12-03 00:27:05.360 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.692s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:05 compute-1 nova_compute[187157]: 2025-12-03 00:27:05.608 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:05 compute-1 podman[197537]: time="2025-12-03T00:27:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:27:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:27:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:27:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:27:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3080 "" "Go-http-client/1.1"
Dec 03 00:27:05 compute-1 nova_compute[187157]: 2025-12-03 00:27:05.677 187161 INFO nova.compute.manager [-] [instance: 8e46c30c-3390-4271-98a3-af0ca5c223bd] Took 1.49 seconds to deallocate network for instance.
Dec 03 00:27:06 compute-1 nova_compute[187157]: 2025-12-03 00:27:06.205 187161 DEBUG oslo_concurrency.lockutils [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:27:06 compute-1 nova_compute[187157]: 2025-12-03 00:27:06.206 187161 DEBUG oslo_concurrency.lockutils [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:27:06 compute-1 nova_compute[187157]: 2025-12-03 00:27:06.263 187161 DEBUG nova.compute.provider_tree [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:27:06 compute-1 nova_compute[187157]: 2025-12-03 00:27:06.775 187161 DEBUG nova.scheduler.client.report [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:27:07 compute-1 podman[222064]: 2025-12-03 00:27:07.235902128 +0000 UTC m=+0.075222668 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Dec 03 00:27:07 compute-1 nova_compute[187157]: 2025-12-03 00:27:07.287 187161 DEBUG oslo_concurrency.lockutils [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.082s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:07 compute-1 nova_compute[187157]: 2025-12-03 00:27:07.306 187161 INFO nova.scheduler.client.report [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Deleted allocations for instance 8e46c30c-3390-4271-98a3-af0ca5c223bd
Dec 03 00:27:07 compute-1 sshd-session[222083]: Invalid user solana from 45.148.10.240 port 54294
Dec 03 00:27:07 compute-1 sshd-session[222083]: Connection closed by invalid user solana 45.148.10.240 port 54294 [preauth]
Dec 03 00:27:08 compute-1 nova_compute[187157]: 2025-12-03 00:27:08.356 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:08 compute-1 nova_compute[187157]: 2025-12-03 00:27:08.356 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:08 compute-1 nova_compute[187157]: 2025-12-03 00:27:08.356 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:08 compute-1 nova_compute[187157]: 2025-12-03 00:27:08.356 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:27:08 compute-1 nova_compute[187157]: 2025-12-03 00:27:08.357 187161 DEBUG oslo_concurrency.lockutils [None req-520ce63d-72d2-4b11-9805-2a98d1231731 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "8e46c30c-3390-4271-98a3-af0ca5c223bd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.969s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:08 compute-1 nova_compute[187157]: 2025-12-03 00:27:08.710 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:09 compute-1 nova_compute[187157]: 2025-12-03 00:27:09.817 187161 DEBUG oslo_concurrency.lockutils [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:27:09 compute-1 nova_compute[187157]: 2025-12-03 00:27:09.818 187161 DEBUG oslo_concurrency.lockutils [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:27:09 compute-1 nova_compute[187157]: 2025-12-03 00:27:09.818 187161 DEBUG oslo_concurrency.lockutils [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:27:09 compute-1 nova_compute[187157]: 2025-12-03 00:27:09.819 187161 DEBUG oslo_concurrency.lockutils [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:27:09 compute-1 nova_compute[187157]: 2025-12-03 00:27:09.819 187161 DEBUG oslo_concurrency.lockutils [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:09 compute-1 nova_compute[187157]: 2025-12-03 00:27:09.835 187161 INFO nova.compute.manager [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Terminating instance
Dec 03 00:27:10 compute-1 podman[222089]: 2025-12-03 00:27:10.236224732 +0000 UTC m=+0.069418198 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 03 00:27:10 compute-1 nova_compute[187157]: 2025-12-03 00:27:10.357 187161 DEBUG nova.compute.manager [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:27:10 compute-1 kernel: tap697e2ff1-39 (unregistering): left promiscuous mode
Dec 03 00:27:10 compute-1 NetworkManager[55553]: <info>  [1764721630.3797] device (tap697e2ff1-39): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:27:10 compute-1 nova_compute[187157]: 2025-12-03 00:27:10.384 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:10 compute-1 ovn_controller[95464]: 2025-12-03T00:27:10Z|00304|binding|INFO|Releasing lport 697e2ff1-393b-4c81-abc1-b7afc93f0e5b from this chassis (sb_readonly=0)
Dec 03 00:27:10 compute-1 ovn_controller[95464]: 2025-12-03T00:27:10Z|00305|binding|INFO|Setting lport 697e2ff1-393b-4c81-abc1-b7afc93f0e5b down in Southbound
Dec 03 00:27:10 compute-1 ovn_controller[95464]: 2025-12-03T00:27:10Z|00306|binding|INFO|Removing iface tap697e2ff1-39 ovn-installed in OVS
Dec 03 00:27:10 compute-1 nova_compute[187157]: 2025-12-03 00:27:10.385 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:10 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:10.392 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:af:d0 10.100.0.11'], port_security=['fa:16:3e:d1:af:d0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '20d06540-44a6-4c4c-ab2f-d4997af86fa0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8545a5c94f84697a8605fadf08251f7', 'neutron:revision_number': '14', 'neutron:security_group_ids': '85b55f5e-0cbc-47d6-baaa-5c5f70692f0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c447000-beb4-4b86-8116-0ff3837374dd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=697e2ff1-393b-4c81-abc1-b7afc93f0e5b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:27:10 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:10.393 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 697e2ff1-393b-4c81-abc1-b7afc93f0e5b in datapath f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 unbound from our chassis
Dec 03 00:27:10 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:10.394 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:27:10 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:10.395 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[09589112-e15a-47f3-a604-a1de97bd435a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:10 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:10.395 104348 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 namespace which is not needed anymore
Dec 03 00:27:10 compute-1 nova_compute[187157]: 2025-12-03 00:27:10.400 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:10 compute-1 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000020.scope: Deactivated successfully.
Dec 03 00:27:10 compute-1 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000020.scope: Consumed 3.074s CPU time.
Dec 03 00:27:10 compute-1 systemd-machined[153454]: Machine qemu-28-instance-00000020 terminated.
Dec 03 00:27:10 compute-1 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[221691]: [NOTICE]   (221695) : haproxy version is 3.0.5-8e879a5
Dec 03 00:27:10 compute-1 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[221691]: [NOTICE]   (221695) : path to executable is /usr/sbin/haproxy
Dec 03 00:27:10 compute-1 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[221691]: [WARNING]  (221695) : Exiting Master process...
Dec 03 00:27:10 compute-1 podman[222135]: 2025-12-03 00:27:10.497776543 +0000 UTC m=+0.027581108 container kill c8b976c4d51c14e575d4bdc7a4cd6bc0ef5233e78b9d64e0eb3cf64e29962042 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:27:10 compute-1 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[221691]: [ALERT]    (221695) : Current worker (221697) exited with code 143 (Terminated)
Dec 03 00:27:10 compute-1 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[221691]: [WARNING]  (221695) : All workers exited. Exiting... (0)
Dec 03 00:27:10 compute-1 systemd[1]: libpod-c8b976c4d51c14e575d4bdc7a4cd6bc0ef5233e78b9d64e0eb3cf64e29962042.scope: Deactivated successfully.
Dec 03 00:27:10 compute-1 podman[222150]: 2025-12-03 00:27:10.533297991 +0000 UTC m=+0.020370843 container died c8b976c4d51c14e575d4bdc7a4cd6bc0ef5233e78b9d64e0eb3cf64e29962042 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Dec 03 00:27:10 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c8b976c4d51c14e575d4bdc7a4cd6bc0ef5233e78b9d64e0eb3cf64e29962042-userdata-shm.mount: Deactivated successfully.
Dec 03 00:27:10 compute-1 systemd[1]: var-lib-containers-storage-overlay-64fcf8e8c97ba8825e177563d0af97e0ea0a7b8199dbf430b56774574731b73f-merged.mount: Deactivated successfully.
Dec 03 00:27:10 compute-1 podman[222150]: 2025-12-03 00:27:10.574906757 +0000 UTC m=+0.061979589 container cleanup c8b976c4d51c14e575d4bdc7a4cd6bc0ef5233e78b9d64e0eb3cf64e29962042 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Dec 03 00:27:10 compute-1 nova_compute[187157]: 2025-12-03 00:27:10.577 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:10 compute-1 systemd[1]: libpod-conmon-c8b976c4d51c14e575d4bdc7a4cd6bc0ef5233e78b9d64e0eb3cf64e29962042.scope: Deactivated successfully.
Dec 03 00:27:10 compute-1 nova_compute[187157]: 2025-12-03 00:27:10.581 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:10 compute-1 podman[222157]: 2025-12-03 00:27:10.597904923 +0000 UTC m=+0.068192189 container remove c8b976c4d51c14e575d4bdc7a4cd6bc0ef5233e78b9d64e0eb3cf64e29962042 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 00:27:10 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:10.603 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[6b121e0c-09c8-4d9b-9b28-a57d4c5f1bd2]: (4, ("Wed Dec  3 12:27:10 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 (c8b976c4d51c14e575d4bdc7a4cd6bc0ef5233e78b9d64e0eb3cf64e29962042)\nc8b976c4d51c14e575d4bdc7a4cd6bc0ef5233e78b9d64e0eb3cf64e29962042\nWed Dec  3 12:27:10 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 (c8b976c4d51c14e575d4bdc7a4cd6bc0ef5233e78b9d64e0eb3cf64e29962042)\nc8b976c4d51c14e575d4bdc7a4cd6bc0ef5233e78b9d64e0eb3cf64e29962042\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:10 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:10.604 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[88281898-193d-403a-a289-d8239b39df70]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:10 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:10.604 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:27:10 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:10.605 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[c03318ac-6a27-4dc7-9f98-cdea4eb169e5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:10 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:10.605 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a76663-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:27:10 compute-1 nova_compute[187157]: 2025-12-03 00:27:10.607 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:10 compute-1 kernel: tapf7a76663-50: left promiscuous mode
Dec 03 00:27:10 compute-1 nova_compute[187157]: 2025-12-03 00:27:10.614 187161 INFO nova.virt.libvirt.driver [-] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Instance destroyed successfully.
Dec 03 00:27:10 compute-1 nova_compute[187157]: 2025-12-03 00:27:10.614 187161 DEBUG nova.objects.instance [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lazy-loading 'resources' on Instance uuid 20d06540-44a6-4c4c-ab2f-d4997af86fa0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:27:10 compute-1 nova_compute[187157]: 2025-12-03 00:27:10.621 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:10 compute-1 nova_compute[187157]: 2025-12-03 00:27:10.622 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:10 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:10.624 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc73539-59be-4f4f-95bf-5132a01a74be]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:10 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:10.640 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e9acf3ea-2d53-4fec-a95c-b2814e314548]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:10 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:10.641 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[5598edd0-b87c-4778-bf96-95179c0f91ce]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:10 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:10.654 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b821672b-ab5d-48c8-b572-c1964a8b93cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 558013, 'reachable_time': 21948, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222201, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:10 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:10.656 104464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:27:10 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:10.656 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[ce09201b-0b1a-4936-ad4b-fed1179e823f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:10 compute-1 systemd[1]: run-netns-ovnmeta\x2df7a76663\x2d52a3\x2d4e8c\x2daf8a\x2d8ef26c8fecf2.mount: Deactivated successfully.
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.121 187161 DEBUG nova.virt.libvirt.vif [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-03T00:25:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1695178384',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1695178384',id=32,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:25:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-sxcn790n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',clean_attempts='1',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mo
del='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:26:56Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=20d06540-44a6-4c4c-ab2f-d4997af86fa0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.121 187161 DEBUG nova.network.os_vif_util [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converting VIF {"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.122 187161 DEBUG nova.network.os_vif_util [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d1:af:d0,bridge_name='br-int',has_traffic_filtering=True,id=697e2ff1-393b-4c81-abc1-b7afc93f0e5b,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap697e2ff1-39') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.122 187161 DEBUG os_vif [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:af:d0,bridge_name='br-int',has_traffic_filtering=True,id=697e2ff1-393b-4c81-abc1-b7afc93f0e5b,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap697e2ff1-39') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.124 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.124 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap697e2ff1-39, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.125 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.127 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.128 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.128 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=3211cfe7-779a-44b8-886d-4f1ae3c3a89e) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.129 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.130 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.131 187161 INFO os_vif [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:af:d0,bridge_name='br-int',has_traffic_filtering=True,id=697e2ff1-393b-4c81-abc1-b7afc93f0e5b,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap697e2ff1-39')
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.132 187161 INFO nova.virt.libvirt.driver [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Deleting instance files /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0_del
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.133 187161 INFO nova.virt.libvirt.driver [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Deletion of /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0_del complete
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.320 187161 DEBUG nova.compute.manager [req-67e5ce2d-b49d-46e2-b7ff-f977b67a7648 req-366789d1-552a-4a49-a2ff-199fd6d15925 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-vif-unplugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.321 187161 DEBUG oslo_concurrency.lockutils [req-67e5ce2d-b49d-46e2-b7ff-f977b67a7648 req-366789d1-552a-4a49-a2ff-199fd6d15925 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.321 187161 DEBUG oslo_concurrency.lockutils [req-67e5ce2d-b49d-46e2-b7ff-f977b67a7648 req-366789d1-552a-4a49-a2ff-199fd6d15925 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.321 187161 DEBUG oslo_concurrency.lockutils [req-67e5ce2d-b49d-46e2-b7ff-f977b67a7648 req-366789d1-552a-4a49-a2ff-199fd6d15925 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.321 187161 DEBUG nova.compute.manager [req-67e5ce2d-b49d-46e2-b7ff-f977b67a7648 req-366789d1-552a-4a49-a2ff-199fd6d15925 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] No waiting events found dispatching network-vif-unplugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.322 187161 DEBUG nova.compute.manager [req-67e5ce2d-b49d-46e2-b7ff-f977b67a7648 req-366789d1-552a-4a49-a2ff-199fd6d15925 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-vif-unplugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.646 187161 INFO nova.compute.manager [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Took 1.29 seconds to destroy the instance on the hypervisor.
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.647 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.648 187161 DEBUG nova.compute.manager [-] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.648 187161 DEBUG nova.network.neutron [-] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:27:11 compute-1 nova_compute[187157]: 2025-12-03 00:27:11.649 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:27:12 compute-1 nova_compute[187157]: 2025-12-03 00:27:12.204 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:27:12 compute-1 nova_compute[187157]: 2025-12-03 00:27:12.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:13 compute-1 nova_compute[187157]: 2025-12-03 00:27:13.276 187161 DEBUG nova.compute.manager [req-55918a79-4237-48dd-9dd6-4b510ad45c93 req-37855df5-a98f-4b97-a6a8-e61b3e6d01f1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-vif-deleted-697e2ff1-393b-4c81-abc1-b7afc93f0e5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:27:13 compute-1 nova_compute[187157]: 2025-12-03 00:27:13.277 187161 INFO nova.compute.manager [req-55918a79-4237-48dd-9dd6-4b510ad45c93 req-37855df5-a98f-4b97-a6a8-e61b3e6d01f1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Neutron deleted interface 697e2ff1-393b-4c81-abc1-b7afc93f0e5b; detaching it from the instance and deleting it from the info cache
Dec 03 00:27:13 compute-1 nova_compute[187157]: 2025-12-03 00:27:13.277 187161 DEBUG nova.network.neutron [req-55918a79-4237-48dd-9dd6-4b510ad45c93 req-37855df5-a98f-4b97-a6a8-e61b3e6d01f1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:27:13 compute-1 nova_compute[187157]: 2025-12-03 00:27:13.396 187161 DEBUG nova.compute.manager [req-ca9a8d10-f522-4f4d-a345-2bd8002a700e req-3158ccc8-366b-4140-a11c-9bd00d2dffd5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-vif-unplugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:27:13 compute-1 nova_compute[187157]: 2025-12-03 00:27:13.397 187161 DEBUG oslo_concurrency.lockutils [req-ca9a8d10-f522-4f4d-a345-2bd8002a700e req-3158ccc8-366b-4140-a11c-9bd00d2dffd5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:27:13 compute-1 nova_compute[187157]: 2025-12-03 00:27:13.397 187161 DEBUG oslo_concurrency.lockutils [req-ca9a8d10-f522-4f4d-a345-2bd8002a700e req-3158ccc8-366b-4140-a11c-9bd00d2dffd5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:27:13 compute-1 nova_compute[187157]: 2025-12-03 00:27:13.398 187161 DEBUG oslo_concurrency.lockutils [req-ca9a8d10-f522-4f4d-a345-2bd8002a700e req-3158ccc8-366b-4140-a11c-9bd00d2dffd5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:13 compute-1 nova_compute[187157]: 2025-12-03 00:27:13.398 187161 DEBUG nova.compute.manager [req-ca9a8d10-f522-4f4d-a345-2bd8002a700e req-3158ccc8-366b-4140-a11c-9bd00d2dffd5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] No waiting events found dispatching network-vif-unplugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:27:13 compute-1 nova_compute[187157]: 2025-12-03 00:27:13.398 187161 DEBUG nova.compute.manager [req-ca9a8d10-f522-4f4d-a345-2bd8002a700e req-3158ccc8-366b-4140-a11c-9bd00d2dffd5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-vif-unplugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:27:13 compute-1 nova_compute[187157]: 2025-12-03 00:27:13.730 187161 DEBUG nova.network.neutron [-] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:27:13 compute-1 nova_compute[187157]: 2025-12-03 00:27:13.786 187161 DEBUG nova.compute.manager [req-55918a79-4237-48dd-9dd6-4b510ad45c93 req-37855df5-a98f-4b97-a6a8-e61b3e6d01f1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Detach interface failed, port_id=697e2ff1-393b-4c81-abc1-b7afc93f0e5b, reason: Instance 20d06540-44a6-4c4c-ab2f-d4997af86fa0 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:27:14 compute-1 nova_compute[187157]: 2025-12-03 00:27:14.236 187161 INFO nova.compute.manager [-] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Took 2.59 seconds to deallocate network for instance.
Dec 03 00:27:14 compute-1 nova_compute[187157]: 2025-12-03 00:27:14.758 187161 DEBUG oslo_concurrency.lockutils [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:27:14 compute-1 nova_compute[187157]: 2025-12-03 00:27:14.758 187161 DEBUG oslo_concurrency.lockutils [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:27:14 compute-1 nova_compute[187157]: 2025-12-03 00:27:14.839 187161 DEBUG nova.compute.provider_tree [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:27:15 compute-1 nova_compute[187157]: 2025-12-03 00:27:15.350 187161 DEBUG nova.scheduler.client.report [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:27:15 compute-1 nova_compute[187157]: 2025-12-03 00:27:15.642 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:15 compute-1 nova_compute[187157]: 2025-12-03 00:27:15.861 187161 DEBUG oslo_concurrency.lockutils [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:15 compute-1 nova_compute[187157]: 2025-12-03 00:27:15.883 187161 INFO nova.scheduler.client.report [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Deleted allocations for instance 20d06540-44a6-4c4c-ab2f-d4997af86fa0
Dec 03 00:27:16 compute-1 nova_compute[187157]: 2025-12-03 00:27:16.129 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:16 compute-1 nova_compute[187157]: 2025-12-03 00:27:16.696 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:17 compute-1 nova_compute[187157]: 2025-12-03 00:27:17.084 187161 DEBUG oslo_concurrency.lockutils [None req-b03d7f58-8a61-4976-b0e6-de1af168b8d8 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.267s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:19 compute-1 openstack_network_exporter[199685]: ERROR   00:27:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:27:19 compute-1 openstack_network_exporter[199685]: ERROR   00:27:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:27:19 compute-1 openstack_network_exporter[199685]: ERROR   00:27:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:27:19 compute-1 openstack_network_exporter[199685]: ERROR   00:27:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:27:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:27:19 compute-1 openstack_network_exporter[199685]: ERROR   00:27:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:27:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:27:20 compute-1 podman[222202]: 2025-12-03 00:27:20.207554844 +0000 UTC m=+0.052979701 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:27:20 compute-1 nova_compute[187157]: 2025-12-03 00:27:20.643 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:21 compute-1 nova_compute[187157]: 2025-12-03 00:27:21.131 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:23 compute-1 podman[222226]: 2025-12-03 00:27:23.215225627 +0000 UTC m=+0.053540354 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 00:27:23 compute-1 podman[222237]: 2025-12-03 00:27:23.283072107 +0000 UTC m=+0.080599029 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 03 00:27:25 compute-1 nova_compute[187157]: 2025-12-03 00:27:25.691 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:26 compute-1 nova_compute[187157]: 2025-12-03 00:27:26.133 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:30 compute-1 nova_compute[187157]: 2025-12-03 00:27:30.693 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:31 compute-1 nova_compute[187157]: 2025-12-03 00:27:31.135 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:34 compute-1 sshd-session[222271]: Received disconnect from 193.46.255.20 port 63108:11:  [preauth]
Dec 03 00:27:34 compute-1 sshd-session[222271]: Disconnected from authenticating user root 193.46.255.20 port 63108 [preauth]
Dec 03 00:27:35 compute-1 podman[197537]: time="2025-12-03T00:27:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:27:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:27:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:27:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:27:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2615 "" "Go-http-client/1.1"
Dec 03 00:27:35 compute-1 nova_compute[187157]: 2025-12-03 00:27:35.764 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:36 compute-1 nova_compute[187157]: 2025-12-03 00:27:36.137 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:37 compute-1 nova_compute[187157]: 2025-12-03 00:27:37.308 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:37 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:37.527 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:27:37 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:37.528 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:27:37 compute-1 nova_compute[187157]: 2025-12-03 00:27:37.528 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:38 compute-1 podman[222274]: 2025-12-03 00:27:38.240086884 +0000 UTC m=+0.076079839 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:27:40 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:40.529 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:27:40 compute-1 nova_compute[187157]: 2025-12-03 00:27:40.767 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:41 compute-1 nova_compute[187157]: 2025-12-03 00:27:41.139 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:41 compute-1 podman[222294]: 2025-12-03 00:27:41.213259853 +0000 UTC m=+0.049360455 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 03 00:27:45 compute-1 nova_compute[187157]: 2025-12-03 00:27:45.769 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:46 compute-1 nova_compute[187157]: 2025-12-03 00:27:46.140 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:48.096 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:2e:d3 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-47c9dea6-51f8-4918-b7de-0893eb139352', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5bcb6274878430cbf268fcd97e3d9d5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41d502de-899a-45f5-a018-49c03d644872, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=aeb951c1-76c1-4a80-a37e-114fc110daf0) old=Port_Binding(mac=['fa:16:3e:5f:2e:d3'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-47c9dea6-51f8-4918-b7de-0893eb139352', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5bcb6274878430cbf268fcd97e3d9d5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:27:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:48.098 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port aeb951c1-76c1-4a80-a37e-114fc110daf0 in datapath 47c9dea6-51f8-4918-b7de-0893eb139352 updated
Dec 03 00:27:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:48.098 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 47c9dea6-51f8-4918-b7de-0893eb139352, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:27:48 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:48.099 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[723e6ca3-27c0-41c9-b5b2-017582704207]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:49 compute-1 openstack_network_exporter[199685]: ERROR   00:27:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:27:49 compute-1 openstack_network_exporter[199685]: ERROR   00:27:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:27:49 compute-1 openstack_network_exporter[199685]: ERROR   00:27:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:27:49 compute-1 openstack_network_exporter[199685]: ERROR   00:27:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:27:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:27:49 compute-1 openstack_network_exporter[199685]: ERROR   00:27:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:27:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:27:50 compute-1 nova_compute[187157]: 2025-12-03 00:27:50.771 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:51 compute-1 nova_compute[187157]: 2025-12-03 00:27:51.142 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:51 compute-1 podman[222316]: 2025-12-03 00:27:51.211576098 +0000 UTC m=+0.050193753 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:27:54 compute-1 podman[222340]: 2025-12-03 00:27:54.267728872 +0000 UTC m=+0.101363860 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 03 00:27:54 compute-1 podman[222341]: 2025-12-03 00:27:54.292313547 +0000 UTC m=+0.129518122 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 03 00:27:54 compute-1 nova_compute[187157]: 2025-12-03 00:27:54.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:55 compute-1 nova_compute[187157]: 2025-12-03 00:27:55.855 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:56 compute-1 nova_compute[187157]: 2025-12-03 00:27:56.144 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:57 compute-1 nova_compute[187157]: 2025-12-03 00:27:57.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:58.415 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:3f:d2 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-2f8079be-7802-4cbd-9c9c-c0cb589fc871', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f8079be-7802-4cbd-9c9c-c0cb589fc871', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '079699d388d64224949dbfaf77fa93bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3a59f07-3b05-4b11-8a33-06a5d4e97331, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0b93d847-3f88-4d4a-9d9c-eebaaf22c0f2) old=Port_Binding(mac=['fa:16:3e:e2:3f:d2'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-2f8079be-7802-4cbd-9c9c-c0cb589fc871', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f8079be-7802-4cbd-9c9c-c0cb589fc871', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '079699d388d64224949dbfaf77fa93bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:27:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:58.416 104348 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0b93d847-3f88-4d4a-9d9c-eebaaf22c0f2 in datapath 2f8079be-7802-4cbd-9c9c-c0cb589fc871 updated
Dec 03 00:27:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:58.417 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2f8079be-7802-4cbd-9c9c-c0cb589fc871, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:27:58 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:27:58.417 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ef73c463-1aea-4430-8e92-1fcbda05f304]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:58 compute-1 nova_compute[187157]: 2025-12-03 00:27:58.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:59 compute-1 nova_compute[187157]: 2025-12-03 00:27:59.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:00 compute-1 nova_compute[187157]: 2025-12-03 00:28:00.859 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:01 compute-1 nova_compute[187157]: 2025-12-03 00:28:01.146 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:01.756 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:01.757 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:01.757 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:02 compute-1 nova_compute[187157]: 2025-12-03 00:28:02.206 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:02 compute-1 nova_compute[187157]: 2025-12-03 00:28:02.729 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:02 compute-1 nova_compute[187157]: 2025-12-03 00:28:02.730 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:02 compute-1 nova_compute[187157]: 2025-12-03 00:28:02.730 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:02 compute-1 nova_compute[187157]: 2025-12-03 00:28:02.730 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:28:02 compute-1 sshd-session[222386]: Invalid user solv from 193.32.162.146 port 34386
Dec 03 00:28:02 compute-1 nova_compute[187157]: 2025-12-03 00:28:02.885 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:28:02 compute-1 nova_compute[187157]: 2025-12-03 00:28:02.886 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:28:02 compute-1 nova_compute[187157]: 2025-12-03 00:28:02.903 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:28:02 compute-1 nova_compute[187157]: 2025-12-03 00:28:02.904 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5837MB free_disk=73.1610107421875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:28:02 compute-1 nova_compute[187157]: 2025-12-03 00:28:02.904 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:02 compute-1 nova_compute[187157]: 2025-12-03 00:28:02.904 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:02 compute-1 sshd-session[222386]: Connection closed by invalid user solv 193.32.162.146 port 34386 [preauth]
Dec 03 00:28:03 compute-1 nova_compute[187157]: 2025-12-03 00:28:03.956 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:28:03 compute-1 nova_compute[187157]: 2025-12-03 00:28:03.957 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:28:02 up  1:35,  0 user,  load average: 0.19, 0.23, 0.26\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:28:03 compute-1 nova_compute[187157]: 2025-12-03 00:28:03.980 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:28:04 compute-1 nova_compute[187157]: 2025-12-03 00:28:04.488 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:28:04 compute-1 nova_compute[187157]: 2025-12-03 00:28:04.996 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:28:04 compute-1 nova_compute[187157]: 2025-12-03 00:28:04.997 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.092s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:05 compute-1 podman[197537]: time="2025-12-03T00:28:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:28:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:28:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:28:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:28:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2618 "" "Go-http-client/1.1"
Dec 03 00:28:05 compute-1 nova_compute[187157]: 2025-12-03 00:28:05.861 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:06 compute-1 nova_compute[187157]: 2025-12-03 00:28:06.149 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:07 compute-1 nova_compute[187157]: 2025-12-03 00:28:07.487 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:07 compute-1 nova_compute[187157]: 2025-12-03 00:28:07.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:07 compute-1 nova_compute[187157]: 2025-12-03 00:28:07.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:28:09 compute-1 podman[222390]: 2025-12-03 00:28:09.209168482 +0000 UTC m=+0.057080830 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:28:09 compute-1 nova_compute[187157]: 2025-12-03 00:28:09.383 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Acquiring lock "1c32f4c5-c959-44c9-be90-2e0a08b52619" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:09 compute-1 nova_compute[187157]: 2025-12-03 00:28:09.383 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:09 compute-1 nova_compute[187157]: 2025-12-03 00:28:09.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:09 compute-1 nova_compute[187157]: 2025-12-03 00:28:09.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:09 compute-1 nova_compute[187157]: 2025-12-03 00:28:09.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 03 00:28:09 compute-1 nova_compute[187157]: 2025-12-03 00:28:09.889 187161 DEBUG nova.compute.manager [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:28:10 compute-1 nova_compute[187157]: 2025-12-03 00:28:10.445 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:10 compute-1 nova_compute[187157]: 2025-12-03 00:28:10.446 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:10 compute-1 nova_compute[187157]: 2025-12-03 00:28:10.453 187161 DEBUG nova.virt.hardware [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:28:10 compute-1 nova_compute[187157]: 2025-12-03 00:28:10.453 187161 INFO nova.compute.claims [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Claim successful on node compute-1.ctlplane.example.com
Dec 03 00:28:10 compute-1 ovn_controller[95464]: 2025-12-03T00:28:10Z|00307|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 03 00:28:10 compute-1 nova_compute[187157]: 2025-12-03 00:28:10.861 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:11 compute-1 nova_compute[187157]: 2025-12-03 00:28:11.150 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:11 compute-1 nova_compute[187157]: 2025-12-03 00:28:11.522 187161 DEBUG nova.compute.provider_tree [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:28:12 compute-1 nova_compute[187157]: 2025-12-03 00:28:12.032 187161 DEBUG nova.scheduler.client.report [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:28:12 compute-1 podman[222411]: 2025-12-03 00:28:12.213252957 +0000 UTC m=+0.060411570 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 03 00:28:12 compute-1 nova_compute[187157]: 2025-12-03 00:28:12.543 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.097s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:12 compute-1 nova_compute[187157]: 2025-12-03 00:28:12.544 187161 DEBUG nova.compute.manager [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:28:13 compute-1 nova_compute[187157]: 2025-12-03 00:28:13.060 187161 DEBUG nova.compute.manager [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:28:13 compute-1 nova_compute[187157]: 2025-12-03 00:28:13.061 187161 DEBUG nova.network.neutron [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:28:13 compute-1 nova_compute[187157]: 2025-12-03 00:28:13.061 187161 WARNING neutronclient.v2_0.client [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:28:13 compute-1 nova_compute[187157]: 2025-12-03 00:28:13.061 187161 WARNING neutronclient.v2_0.client [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:28:13 compute-1 nova_compute[187157]: 2025-12-03 00:28:13.567 187161 INFO nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:28:13 compute-1 nova_compute[187157]: 2025-12-03 00:28:13.778 187161 DEBUG nova.network.neutron [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Successfully created port: 21b5b048-03fd-4cce-b80e-426f2c35c56a _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:28:14 compute-1 nova_compute[187157]: 2025-12-03 00:28:14.076 187161 DEBUG nova.compute.manager [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:28:14 compute-1 nova_compute[187157]: 2025-12-03 00:28:14.209 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:14 compute-1 nova_compute[187157]: 2025-12-03 00:28:14.362 187161 DEBUG nova.network.neutron [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Successfully updated port: 21b5b048-03fd-4cce-b80e-426f2c35c56a _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:28:14 compute-1 nova_compute[187157]: 2025-12-03 00:28:14.447 187161 DEBUG nova.compute.manager [req-1192ac20-2f19-4b82-b4d0-32d41fd9acfd req-b3218d72-21c8-440c-a542-b606252713c4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Received event network-changed-21b5b048-03fd-4cce-b80e-426f2c35c56a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:28:14 compute-1 nova_compute[187157]: 2025-12-03 00:28:14.447 187161 DEBUG nova.compute.manager [req-1192ac20-2f19-4b82-b4d0-32d41fd9acfd req-b3218d72-21c8-440c-a542-b606252713c4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Refreshing instance network info cache due to event network-changed-21b5b048-03fd-4cce-b80e-426f2c35c56a. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:28:14 compute-1 nova_compute[187157]: 2025-12-03 00:28:14.448 187161 DEBUG oslo_concurrency.lockutils [req-1192ac20-2f19-4b82-b4d0-32d41fd9acfd req-b3218d72-21c8-440c-a542-b606252713c4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-1c32f4c5-c959-44c9-be90-2e0a08b52619" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:28:14 compute-1 nova_compute[187157]: 2025-12-03 00:28:14.448 187161 DEBUG oslo_concurrency.lockutils [req-1192ac20-2f19-4b82-b4d0-32d41fd9acfd req-b3218d72-21c8-440c-a542-b606252713c4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-1c32f4c5-c959-44c9-be90-2e0a08b52619" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:28:14 compute-1 nova_compute[187157]: 2025-12-03 00:28:14.448 187161 DEBUG nova.network.neutron [req-1192ac20-2f19-4b82-b4d0-32d41fd9acfd req-b3218d72-21c8-440c-a542-b606252713c4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Refreshing network info cache for port 21b5b048-03fd-4cce-b80e-426f2c35c56a _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:28:14 compute-1 nova_compute[187157]: 2025-12-03 00:28:14.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:14 compute-1 nova_compute[187157]: 2025-12-03 00:28:14.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 03 00:28:14 compute-1 nova_compute[187157]: 2025-12-03 00:28:14.867 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Acquiring lock "refresh_cache-1c32f4c5-c959-44c9-be90-2e0a08b52619" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:28:14 compute-1 nova_compute[187157]: 2025-12-03 00:28:14.953 187161 WARNING neutronclient.v2_0.client [req-1192ac20-2f19-4b82-b4d0-32d41fd9acfd req-b3218d72-21c8-440c-a542-b606252713c4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.041 187161 DEBUG nova.network.neutron [req-1192ac20-2f19-4b82-b4d0-32d41fd9acfd req-b3218d72-21c8-440c-a542-b606252713c4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.093 187161 DEBUG nova.compute.manager [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.094 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.095 187161 INFO nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Creating image(s)
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.095 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Acquiring lock "/var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.095 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "/var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.096 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "/var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.096 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.099 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.101 187161 DEBUG oslo_concurrency.processutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.148 187161 DEBUG oslo_concurrency.processutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.148 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.149 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.149 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.152 187161 DEBUG oslo_utils.imageutils.format_inspector [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.152 187161 DEBUG oslo_concurrency.processutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.160 187161 DEBUG nova.network.neutron [req-1192ac20-2f19-4b82-b4d0-32d41fd9acfd req-b3218d72-21c8-440c-a542-b606252713c4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.202 187161 DEBUG oslo_concurrency.processutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.202 187161 DEBUG oslo_concurrency.processutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.210 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.235 187161 DEBUG oslo_concurrency.processutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.236 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.087s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.236 187161 DEBUG oslo_concurrency.processutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.287 187161 DEBUG oslo_concurrency.processutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.288 187161 DEBUG nova.virt.disk.api [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Checking if we can resize image /var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.288 187161 DEBUG oslo_concurrency.processutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.341 187161 DEBUG oslo_concurrency.processutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.342 187161 DEBUG nova.virt.disk.api [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Cannot resize image /var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.342 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.342 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Ensure instance console log exists: /var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.343 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.343 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.343 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.667 187161 DEBUG oslo_concurrency.lockutils [req-1192ac20-2f19-4b82-b4d0-32d41fd9acfd req-b3218d72-21c8-440c-a542-b606252713c4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-1c32f4c5-c959-44c9-be90-2e0a08b52619" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.668 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Acquired lock "refresh_cache-1c32f4c5-c959-44c9-be90-2e0a08b52619" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.668 187161 DEBUG nova.network.neutron [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:28:15 compute-1 nova_compute[187157]: 2025-12-03 00:28:15.863 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:16 compute-1 nova_compute[187157]: 2025-12-03 00:28:16.152 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:16 compute-1 nova_compute[187157]: 2025-12-03 00:28:16.398 187161 DEBUG nova.network.neutron [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:28:16 compute-1 nova_compute[187157]: 2025-12-03 00:28:16.642 187161 WARNING neutronclient.v2_0.client [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:28:16 compute-1 nova_compute[187157]: 2025-12-03 00:28:16.807 187161 DEBUG nova.network.neutron [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Updating instance_info_cache with network_info: [{"id": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "address": "fa:16:3e:9b:03:35", "network": {"id": "47c9dea6-51f8-4918-b7de-0893eb139352", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-629213992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5bcb6274878430cbf268fcd97e3d9d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21b5b048-03", "ovs_interfaceid": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.312 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Releasing lock "refresh_cache-1c32f4c5-c959-44c9-be90-2e0a08b52619" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.313 187161 DEBUG nova.compute.manager [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Instance network_info: |[{"id": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "address": "fa:16:3e:9b:03:35", "network": {"id": "47c9dea6-51f8-4918-b7de-0893eb139352", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-629213992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5bcb6274878430cbf268fcd97e3d9d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21b5b048-03", "ovs_interfaceid": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.317 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Start _get_guest_xml network_info=[{"id": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "address": "fa:16:3e:9b:03:35", "network": {"id": "47c9dea6-51f8-4918-b7de-0893eb139352", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-629213992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5bcb6274878430cbf268fcd97e3d9d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21b5b048-03", "ovs_interfaceid": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.321 187161 WARNING nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.322 187161 DEBUG nova.virt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategyVolume-server-1212261301', uuid='1c32f4c5-c959-44c9-be90-2e0a08b52619'), owner=OwnerMeta(userid='bc59879cc7d442cb9c60a8c6aebf4e24', username='tempest-TestExecuteZoneMigrationStrategyVolume-1776646898-project-admin', projectid='079699d388d64224949dbfaf77fa93bd', projectname='tempest-TestExecuteZoneMigrationStrategyVolume-1776646898'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "address": "fa:16:3e:9b:03:35", "network": {"id": "47c9dea6-51f8-4918-b7de-0893eb139352", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-629213992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5bcb6274878430cbf268fcd97e3d9d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21b5b048-03", "ovs_interfaceid": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764721697.3225658) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.326 187161 DEBUG nova.virt.libvirt.host [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.327 187161 DEBUG nova.virt.libvirt.host [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.330 187161 DEBUG nova.virt.libvirt.host [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.331 187161 DEBUG nova.virt.libvirt.host [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.333 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.333 187161 DEBUG nova.virt.hardware [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.334 187161 DEBUG nova.virt.hardware [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.334 187161 DEBUG nova.virt.hardware [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.335 187161 DEBUG nova.virt.hardware [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.335 187161 DEBUG nova.virt.hardware [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.336 187161 DEBUG nova.virt.hardware [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.336 187161 DEBUG nova.virt.hardware [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.337 187161 DEBUG nova.virt.hardware [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.337 187161 DEBUG nova.virt.hardware [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.338 187161 DEBUG nova.virt.hardware [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.338 187161 DEBUG nova.virt.hardware [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.346 187161 DEBUG nova.virt.libvirt.vif [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:28:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategyVolume-server-1212261301',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategyvolume-server-121226130',id=34,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='079699d388d64224949dbfaf77fa93bd',ramdisk_id='',reservation_id='r-85br1zms',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategyVolume-1776646898',owner_user_name
='tempest-TestExecuteZoneMigrationStrategyVolume-1776646898-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:28:14Z,user_data=None,user_id='bc59879cc7d442cb9c60a8c6aebf4e24',uuid=1c32f4c5-c959-44c9-be90-2e0a08b52619,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "address": "fa:16:3e:9b:03:35", "network": {"id": "47c9dea6-51f8-4918-b7de-0893eb139352", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-629213992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5bcb6274878430cbf268fcd97e3d9d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21b5b048-03", "ovs_interfaceid": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.347 187161 DEBUG nova.network.os_vif_util [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Converting VIF {"id": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "address": "fa:16:3e:9b:03:35", "network": {"id": "47c9dea6-51f8-4918-b7de-0893eb139352", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-629213992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5bcb6274878430cbf268fcd97e3d9d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21b5b048-03", "ovs_interfaceid": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.349 187161 DEBUG nova.network.os_vif_util [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:03:35,bridge_name='br-int',has_traffic_filtering=True,id=21b5b048-03fd-4cce-b80e-426f2c35c56a,network=Network(47c9dea6-51f8-4918-b7de-0893eb139352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21b5b048-03') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.350 187161 DEBUG nova.objects.instance [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lazy-loading 'pci_devices' on Instance uuid 1c32f4c5-c959-44c9-be90-2e0a08b52619 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.860 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:28:17 compute-1 nova_compute[187157]:   <uuid>1c32f4c5-c959-44c9-be90-2e0a08b52619</uuid>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   <name>instance-00000022</name>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   <memory>131072</memory>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   <vcpu>1</vcpu>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   <metadata>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <nova:name>tempest-TestExecuteZoneMigrationStrategyVolume-server-1212261301</nova:name>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <nova:creationTime>2025-12-03 00:28:17</nova:creationTime>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:28:17 compute-1 nova_compute[187157]:         <nova:memory>128</nova:memory>
Dec 03 00:28:17 compute-1 nova_compute[187157]:         <nova:disk>1</nova:disk>
Dec 03 00:28:17 compute-1 nova_compute[187157]:         <nova:swap>0</nova:swap>
Dec 03 00:28:17 compute-1 nova_compute[187157]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:28:17 compute-1 nova_compute[187157]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:28:17 compute-1 nova_compute[187157]:         <nova:extraSpecs>
Dec 03 00:28:17 compute-1 nova_compute[187157]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:28:17 compute-1 nova_compute[187157]:         </nova:extraSpecs>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       </nova:flavor>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:28:17 compute-1 nova_compute[187157]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:28:17 compute-1 nova_compute[187157]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:28:17 compute-1 nova_compute[187157]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:28:17 compute-1 nova_compute[187157]:         <nova:minRam>0</nova:minRam>
Dec 03 00:28:17 compute-1 nova_compute[187157]:         <nova:properties>
Dec 03 00:28:17 compute-1 nova_compute[187157]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:28:17 compute-1 nova_compute[187157]:         </nova:properties>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       </nova:image>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <nova:owner>
Dec 03 00:28:17 compute-1 nova_compute[187157]:         <nova:user uuid="bc59879cc7d442cb9c60a8c6aebf4e24">tempest-TestExecuteZoneMigrationStrategyVolume-1776646898-project-admin</nova:user>
Dec 03 00:28:17 compute-1 nova_compute[187157]:         <nova:project uuid="079699d388d64224949dbfaf77fa93bd">tempest-TestExecuteZoneMigrationStrategyVolume-1776646898</nova:project>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       </nova:owner>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <nova:ports>
Dec 03 00:28:17 compute-1 nova_compute[187157]:         <nova:port uuid="21b5b048-03fd-4cce-b80e-426f2c35c56a">
Dec 03 00:28:17 compute-1 nova_compute[187157]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:         </nova:port>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       </nova:ports>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     </nova:instance>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   </metadata>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   <sysinfo type="smbios">
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <system>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <entry name="serial">1c32f4c5-c959-44c9-be90-2e0a08b52619</entry>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <entry name="uuid">1c32f4c5-c959-44c9-be90-2e0a08b52619</entry>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     </system>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   </sysinfo>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   <os>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <boot dev="hd"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <smbios mode="sysinfo"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   </os>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   <features>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <acpi/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <apic/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <vmcoreinfo/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   </features>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   <clock offset="utc">
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <timer name="hpet" present="no"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   </clock>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   <cpu mode="custom" match="exact">
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <model>Nehalem</model>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   </cpu>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   <devices>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <disk type="file" device="disk">
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <target dev="vda" bus="virtio"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <disk type="file" device="cdrom">
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <source file="/var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk.config"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <target dev="sda" bus="sata"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     </disk>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <interface type="ethernet">
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <mac address="fa:16:3e:9b:03:35"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <mtu size="1442"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <target dev="tap21b5b048-03"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     </interface>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <serial type="pty">
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <log file="/var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/console.log" append="off"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     </serial>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <video>
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <model type="virtio"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     </video>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <input type="tablet" bus="usb"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <rng model="virtio">
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     </rng>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <controller type="usb" index="0"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:28:17 compute-1 nova_compute[187157]:       <stats period="10"/>
Dec 03 00:28:17 compute-1 nova_compute[187157]:     </memballoon>
Dec 03 00:28:17 compute-1 nova_compute[187157]:   </devices>
Dec 03 00:28:17 compute-1 nova_compute[187157]: </domain>
Dec 03 00:28:17 compute-1 nova_compute[187157]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.863 187161 DEBUG nova.compute.manager [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Preparing to wait for external event network-vif-plugged-21b5b048-03fd-4cce-b80e-426f2c35c56a prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.863 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Acquiring lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.863 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.864 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.865 187161 DEBUG nova.virt.libvirt.vif [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-03T00:28:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategyVolume-server-1212261301',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategyvolume-server-121226130',id=34,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='079699d388d64224949dbfaf77fa93bd',ramdisk_id='',reservation_id='r-85br1zms',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategyVolume-1776646898',owner
_user_name='tempest-TestExecuteZoneMigrationStrategyVolume-1776646898-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:28:14Z,user_data=None,user_id='bc59879cc7d442cb9c60a8c6aebf4e24',uuid=1c32f4c5-c959-44c9-be90-2e0a08b52619,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "address": "fa:16:3e:9b:03:35", "network": {"id": "47c9dea6-51f8-4918-b7de-0893eb139352", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-629213992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5bcb6274878430cbf268fcd97e3d9d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21b5b048-03", "ovs_interfaceid": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.865 187161 DEBUG nova.network.os_vif_util [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Converting VIF {"id": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "address": "fa:16:3e:9b:03:35", "network": {"id": "47c9dea6-51f8-4918-b7de-0893eb139352", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-629213992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5bcb6274878430cbf268fcd97e3d9d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21b5b048-03", "ovs_interfaceid": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.866 187161 DEBUG nova.network.os_vif_util [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:03:35,bridge_name='br-int',has_traffic_filtering=True,id=21b5b048-03fd-4cce-b80e-426f2c35c56a,network=Network(47c9dea6-51f8-4918-b7de-0893eb139352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21b5b048-03') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.866 187161 DEBUG os_vif [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:03:35,bridge_name='br-int',has_traffic_filtering=True,id=21b5b048-03fd-4cce-b80e-426f2c35c56a,network=Network(47c9dea6-51f8-4918-b7de-0893eb139352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21b5b048-03') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.867 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.867 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.867 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.868 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.868 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '3b1af904-2e7a-5062-a423-b7b5a26d1091', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.869 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.871 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.872 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.872 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap21b5b048-03, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.873 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap21b5b048-03, col_values=(('qos', UUID('17b412c3-63f6-43eb-a173-c3a7b0cc96e5')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.873 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap21b5b048-03, col_values=(('external_ids', {'iface-id': '21b5b048-03fd-4cce-b80e-426f2c35c56a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9b:03:35', 'vm-uuid': '1c32f4c5-c959-44c9-be90-2e0a08b52619'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.874 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:17 compute-1 NetworkManager[55553]: <info>  [1764721697.8752] manager: (tap21b5b048-03): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.876 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.881 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:17 compute-1 nova_compute[187157]: 2025-12-03 00:28:17.882 187161 INFO os_vif [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:03:35,bridge_name='br-int',has_traffic_filtering=True,id=21b5b048-03fd-4cce-b80e-426f2c35c56a,network=Network(47c9dea6-51f8-4918-b7de-0893eb139352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21b5b048-03')
Dec 03 00:28:19 compute-1 openstack_network_exporter[199685]: ERROR   00:28:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:28:19 compute-1 openstack_network_exporter[199685]: ERROR   00:28:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:28:19 compute-1 openstack_network_exporter[199685]: ERROR   00:28:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:28:19 compute-1 nova_compute[187157]: 2025-12-03 00:28:19.420 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:28:19 compute-1 nova_compute[187157]: 2025-12-03 00:28:19.421 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:28:19 compute-1 nova_compute[187157]: 2025-12-03 00:28:19.421 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] No VIF found with MAC fa:16:3e:9b:03:35, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:28:19 compute-1 nova_compute[187157]: 2025-12-03 00:28:19.421 187161 INFO nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Using config drive
Dec 03 00:28:19 compute-1 openstack_network_exporter[199685]: ERROR   00:28:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:28:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:28:19 compute-1 openstack_network_exporter[199685]: ERROR   00:28:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:28:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:28:19 compute-1 nova_compute[187157]: 2025-12-03 00:28:19.931 187161 WARNING neutronclient.v2_0.client [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.221 187161 INFO nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Creating config drive at /var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk.config
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.228 187161 DEBUG oslo_concurrency.processutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpinbi367e execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.356 187161 DEBUG oslo_concurrency.processutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpinbi367e" returned: 0 in 0.128s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:28:20 compute-1 kernel: tap21b5b048-03: entered promiscuous mode
Dec 03 00:28:20 compute-1 NetworkManager[55553]: <info>  [1764721700.4258] manager: (tap21b5b048-03): new Tun device (/org/freedesktop/NetworkManager/Devices/107)
Dec 03 00:28:20 compute-1 ovn_controller[95464]: 2025-12-03T00:28:20Z|00308|binding|INFO|Claiming lport 21b5b048-03fd-4cce-b80e-426f2c35c56a for this chassis.
Dec 03 00:28:20 compute-1 ovn_controller[95464]: 2025-12-03T00:28:20Z|00309|binding|INFO|21b5b048-03fd-4cce-b80e-426f2c35c56a: Claiming fa:16:3e:9b:03:35 10.100.0.8
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.425 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.429 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.437 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.454 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:03:35 10.100.0.8'], port_security=['fa:16:3e:9b:03:35 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1c32f4c5-c959-44c9-be90-2e0a08b52619', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-47c9dea6-51f8-4918-b7de-0893eb139352', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '079699d388d64224949dbfaf77fa93bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'de55cd34-9754-4d67-ad85-e10a26bc577b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41d502de-899a-45f5-a018-49c03d644872, chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=21b5b048-03fd-4cce-b80e-426f2c35c56a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.455 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 21b5b048-03fd-4cce-b80e-426f2c35c56a in datapath 47c9dea6-51f8-4918-b7de-0893eb139352 bound to our chassis
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.457 104348 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 47c9dea6-51f8-4918-b7de-0893eb139352
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.474 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[b58743ea-43ce-4851-9073-502bd731f297]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.475 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap47c9dea6-51 in ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.477 207957 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap47c9dea6-50 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.477 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[66dc68e1-ce4e-4c97-ad50-3ab6c7056aaf]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.478 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[62fb2ad4-6fcb-48c6-9b6b-9014813dc221]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 systemd-udevd[222468]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:28:20 compute-1 systemd-machined[153454]: New machine qemu-29-instance-00000022.
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.490 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e977a8-cd5a-4874-87b4-8c2b2c96acb9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 NetworkManager[55553]: <info>  [1764721700.5052] device (tap21b5b048-03): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:28:20 compute-1 NetworkManager[55553]: <info>  [1764721700.5061] device (tap21b5b048-03): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.520 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[17c9eb3a-1d91-42ff-a3ff-226f720fc32a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 systemd[1]: Started Virtual Machine qemu-29-instance-00000022.
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.529 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:20 compute-1 ovn_controller[95464]: 2025-12-03T00:28:20Z|00310|binding|INFO|Setting lport 21b5b048-03fd-4cce-b80e-426f2c35c56a ovn-installed in OVS
Dec 03 00:28:20 compute-1 ovn_controller[95464]: 2025-12-03T00:28:20Z|00311|binding|INFO|Setting lport 21b5b048-03fd-4cce-b80e-426f2c35c56a up in Southbound
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.531 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.557 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[95fc389e-b28a-497c-bdcc-b356c54909c2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.563 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e415b70e-782c-4a94-b323-92d48b66ae6a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 NetworkManager[55553]: <info>  [1764721700.5640] manager: (tap47c9dea6-50): new Veth device (/org/freedesktop/NetworkManager/Devices/108)
Dec 03 00:28:20 compute-1 systemd-udevd[222472]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.588 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[69084eaf-7e2e-47a4-8042-7666454374e0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.597 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[1ac40fc0-970c-43d4-a57b-493ce21d180e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 NetworkManager[55553]: <info>  [1764721700.6194] device (tap47c9dea6-50): carrier: link connected
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.625 209338 DEBUG oslo.privsep.daemon [-] privsep: reply[b8a8e12d-3d7f-4878-96c5-2e323348b395]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.640 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[aac5c90c-3899-4802-96c3-0fceeb4d4efb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap47c9dea6-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:2e:d3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572027, 'reachable_time': 40957, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222500, 'error': None, 'target': 'ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.657 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[8b198f18-4fa3-47ce-b091-6518a3f18511]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5f:2ed3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 572027, 'tstamp': 572027}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222501, 'error': None, 'target': 'ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.673 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[ac50b0e4-ddc7-42e9-9a22-825acc65df63]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap47c9dea6-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:2e:d3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572027, 'reachable_time': 40957, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222502, 'error': None, 'target': 'ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.705 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[a34a56ba-84c6-45a1-bebe-353f462384ae]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.762 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[32b04325-06e5-4fd2-8c87-62be9ab4be78]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.763 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47c9dea6-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.763 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.764 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap47c9dea6-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:28:20 compute-1 NetworkManager[55553]: <info>  [1764721700.7665] manager: (tap47c9dea6-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Dec 03 00:28:20 compute-1 kernel: tap47c9dea6-50: entered promiscuous mode
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.767 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.772 187161 DEBUG nova.compute.manager [req-503d1e42-9935-406b-85b1-da5eb7f09c76 req-eb7b6a7b-c500-4c50-b1e9-55f67172b0bd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Received event network-vif-plugged-21b5b048-03fd-4cce-b80e-426f2c35c56a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.773 187161 DEBUG oslo_concurrency.lockutils [req-503d1e42-9935-406b-85b1-da5eb7f09c76 req-eb7b6a7b-c500-4c50-b1e9-55f67172b0bd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.773 187161 DEBUG oslo_concurrency.lockutils [req-503d1e42-9935-406b-85b1-da5eb7f09c76 req-eb7b6a7b-c500-4c50-b1e9-55f67172b0bd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.773 187161 DEBUG oslo_concurrency.lockutils [req-503d1e42-9935-406b-85b1-da5eb7f09c76 req-eb7b6a7b-c500-4c50-b1e9-55f67172b0bd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.774 187161 DEBUG nova.compute.manager [req-503d1e42-9935-406b-85b1-da5eb7f09c76 req-eb7b6a7b-c500-4c50-b1e9-55f67172b0bd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Processing event network-vif-plugged-21b5b048-03fd-4cce-b80e-426f2c35c56a _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.775 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap47c9dea6-50, col_values=(('external_ids', {'iface-id': 'aeb951c1-76c1-4a80-a37e-114fc110daf0'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.776 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:20 compute-1 ovn_controller[95464]: 2025-12-03T00:28:20Z|00312|binding|INFO|Releasing lport aeb951c1-76c1-4a80-a37e-114fc110daf0 from this chassis (sb_readonly=0)
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.803 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.804 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.805 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[33012ef0-272a-41c0-a530-1e9896166638]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.806 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/47c9dea6-51f8-4918-b7de-0893eb139352.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/47c9dea6-51f8-4918-b7de-0893eb139352.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.806 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/47c9dea6-51f8-4918-b7de-0893eb139352.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/47c9dea6-51f8-4918-b7de-0893eb139352.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.806 104348 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 47c9dea6-51f8-4918-b7de-0893eb139352 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.806 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/47c9dea6-51f8-4918-b7de-0893eb139352.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/47c9dea6-51f8-4918-b7de-0893eb139352.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.807 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[71fa89cb-33ef-472f-b0ca-a4be8ecbd1b0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.807 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/47c9dea6-51f8-4918-b7de-0893eb139352.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/47c9dea6-51f8-4918-b7de-0893eb139352.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.808 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[85e44dd1-76aa-4e80-b929-17b8165d0f31]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.808 104348 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: global
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     log         /dev/log local0 debug
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     log-tag     haproxy-metadata-proxy-47c9dea6-51f8-4918-b7de-0893eb139352
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     user        root
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     group       root
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     maxconn     1024
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     pidfile     /var/lib/neutron/external/pids/47c9dea6-51f8-4918-b7de-0893eb139352.pid.haproxy
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     daemon
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: defaults
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     log global
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     mode http
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     option httplog
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     option dontlognull
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     option http-server-close
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     option forwardfor
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     retries                 3
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     timeout http-request    30s
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     timeout connect         30s
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     timeout client          32s
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     timeout server          32s
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     timeout http-keep-alive 30s
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: listen listener
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     bind 169.254.169.254:80
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:     http-request add-header X-OVN-Network-ID 47c9dea6-51f8-4918-b7de-0893eb139352
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:28:20 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:28:20.810 104348 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352', 'env', 'PROCESS_TAG=haproxy-47c9dea6-51f8-4918-b7de-0893eb139352', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/47c9dea6-51f8-4918-b7de-0893eb139352.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:28:20 compute-1 nova_compute[187157]: 2025-12-03 00:28:20.863 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:21 compute-1 podman[222534]: 2025-12-03 00:28:21.191264128 +0000 UTC m=+0.081127311 container create b2adc348e63a9482aaceaa78f8c3c5b0ce953ebb29453554f69a163f48efe9a8 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 03 00:28:21 compute-1 systemd[1]: Started libpod-conmon-b2adc348e63a9482aaceaa78f8c3c5b0ce953ebb29453554f69a163f48efe9a8.scope.
Dec 03 00:28:21 compute-1 podman[222534]: 2025-12-03 00:28:21.143844083 +0000 UTC m=+0.033707256 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:28:21 compute-1 systemd[1]: Started libcrun container.
Dec 03 00:28:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8233503d943a27ff82aef839dd2ce0b83e3165669e489482cda93cc9c1c7aac8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:28:21 compute-1 podman[222534]: 2025-12-03 00:28:21.273386723 +0000 UTC m=+0.163249916 container init b2adc348e63a9482aaceaa78f8c3c5b0ce953ebb29453554f69a163f48efe9a8 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 03 00:28:21 compute-1 podman[222534]: 2025-12-03 00:28:21.281662313 +0000 UTC m=+0.171525466 container start b2adc348e63a9482aaceaa78f8c3c5b0ce953ebb29453554f69a163f48efe9a8 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Dec 03 00:28:21 compute-1 neutron-haproxy-ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352[222549]: [NOTICE]   (222566) : New worker (222574) forked
Dec 03 00:28:21 compute-1 neutron-haproxy-ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352[222549]: [NOTICE]   (222566) : Loading success.
Dec 03 00:28:21 compute-1 podman[222551]: 2025-12-03 00:28:21.314523307 +0000 UTC m=+0.061828565 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:28:21 compute-1 nova_compute[187157]: 2025-12-03 00:28:21.440 187161 DEBUG nova.compute.manager [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:28:21 compute-1 nova_compute[187157]: 2025-12-03 00:28:21.444 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:28:21 compute-1 nova_compute[187157]: 2025-12-03 00:28:21.448 187161 INFO nova.virt.libvirt.driver [-] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Instance spawned successfully.
Dec 03 00:28:21 compute-1 nova_compute[187157]: 2025-12-03 00:28:21.449 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:28:21 compute-1 nova_compute[187157]: 2025-12-03 00:28:21.962 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:28:21 compute-1 nova_compute[187157]: 2025-12-03 00:28:21.962 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:28:21 compute-1 nova_compute[187157]: 2025-12-03 00:28:21.963 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:28:21 compute-1 nova_compute[187157]: 2025-12-03 00:28:21.963 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:28:21 compute-1 nova_compute[187157]: 2025-12-03 00:28:21.963 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:28:21 compute-1 nova_compute[187157]: 2025-12-03 00:28:21.964 187161 DEBUG nova.virt.libvirt.driver [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:28:22 compute-1 nova_compute[187157]: 2025-12-03 00:28:22.475 187161 INFO nova.compute.manager [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Took 7.38 seconds to spawn the instance on the hypervisor.
Dec 03 00:28:22 compute-1 nova_compute[187157]: 2025-12-03 00:28:22.475 187161 DEBUG nova.compute.manager [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:28:22 compute-1 nova_compute[187157]: 2025-12-03 00:28:22.827 187161 DEBUG nova.compute.manager [req-cb0f91d3-3f82-4ba0-8fc7-7e5dd0f05222 req-d640453d-ba1a-479f-8832-78139fb3dfac 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Received event network-vif-plugged-21b5b048-03fd-4cce-b80e-426f2c35c56a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:28:22 compute-1 nova_compute[187157]: 2025-12-03 00:28:22.827 187161 DEBUG oslo_concurrency.lockutils [req-cb0f91d3-3f82-4ba0-8fc7-7e5dd0f05222 req-d640453d-ba1a-479f-8832-78139fb3dfac 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:22 compute-1 nova_compute[187157]: 2025-12-03 00:28:22.828 187161 DEBUG oslo_concurrency.lockutils [req-cb0f91d3-3f82-4ba0-8fc7-7e5dd0f05222 req-d640453d-ba1a-479f-8832-78139fb3dfac 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:22 compute-1 nova_compute[187157]: 2025-12-03 00:28:22.828 187161 DEBUG oslo_concurrency.lockutils [req-cb0f91d3-3f82-4ba0-8fc7-7e5dd0f05222 req-d640453d-ba1a-479f-8832-78139fb3dfac 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:22 compute-1 nova_compute[187157]: 2025-12-03 00:28:22.828 187161 DEBUG nova.compute.manager [req-cb0f91d3-3f82-4ba0-8fc7-7e5dd0f05222 req-d640453d-ba1a-479f-8832-78139fb3dfac 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] No waiting events found dispatching network-vif-plugged-21b5b048-03fd-4cce-b80e-426f2c35c56a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:28:22 compute-1 nova_compute[187157]: 2025-12-03 00:28:22.828 187161 WARNING nova.compute.manager [req-cb0f91d3-3f82-4ba0-8fc7-7e5dd0f05222 req-d640453d-ba1a-479f-8832-78139fb3dfac 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Received unexpected event network-vif-plugged-21b5b048-03fd-4cce-b80e-426f2c35c56a for instance with vm_state active and task_state None.
Dec 03 00:28:22 compute-1 nova_compute[187157]: 2025-12-03 00:28:22.876 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:23 compute-1 nova_compute[187157]: 2025-12-03 00:28:23.008 187161 INFO nova.compute.manager [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Took 12.61 seconds to build instance.
Dec 03 00:28:23 compute-1 nova_compute[187157]: 2025-12-03 00:28:23.514 187161 DEBUG oslo_concurrency.lockutils [None req-7197f4c0-d40a-4888-b403-6c3399dac348 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.130s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:25 compute-1 podman[222595]: 2025-12-03 00:28:25.230180541 +0000 UTC m=+0.064917819 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Dec 03 00:28:25 compute-1 podman[222596]: 2025-12-03 00:28:25.281475461 +0000 UTC m=+0.117496800 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 03 00:28:25 compute-1 nova_compute[187157]: 2025-12-03 00:28:25.865 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:27 compute-1 nova_compute[187157]: 2025-12-03 00:28:27.880 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:28 compute-1 nova_compute[187157]: 2025-12-03 00:28:28.214 187161 DEBUG oslo_concurrency.lockutils [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Acquiring lock "1c32f4c5-c959-44c9-be90-2e0a08b52619" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:28 compute-1 nova_compute[187157]: 2025-12-03 00:28:28.215 187161 DEBUG oslo_concurrency.lockutils [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:28 compute-1 nova_compute[187157]: 2025-12-03 00:28:28.722 187161 DEBUG nova.objects.instance [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lazy-loading 'flavor' on Instance uuid 1c32f4c5-c959-44c9-be90-2e0a08b52619 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:28:30 compute-1 nova_compute[187157]: 2025-12-03 00:28:30.108 187161 DEBUG oslo_concurrency.lockutils [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 1.894s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:30 compute-1 nova_compute[187157]: 2025-12-03 00:28:30.866 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:30 compute-1 nova_compute[187157]: 2025-12-03 00:28:30.887 187161 DEBUG oslo_concurrency.lockutils [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Acquiring lock "1c32f4c5-c959-44c9-be90-2e0a08b52619" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:30 compute-1 nova_compute[187157]: 2025-12-03 00:28:30.887 187161 DEBUG oslo_concurrency.lockutils [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:30 compute-1 nova_compute[187157]: 2025-12-03 00:28:30.888 187161 INFO nova.compute.manager [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Attaching volume fec9baad-7985-4f1a-a1d4-0469965aa4e9 to /dev/vdb
Dec 03 00:28:30 compute-1 nova_compute[187157]: 2025-12-03 00:28:30.888 187161 DEBUG nova.objects.instance [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lazy-loading 'flavor' on Instance uuid 1c32f4c5-c959-44c9-be90-2e0a08b52619 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:28:31 compute-1 nova_compute[187157]: 2025-12-03 00:28:31.658 187161 DEBUG os_brick.utils [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.12/site-packages/os_brick/utils.py:177
Dec 03 00:28:31 compute-1 nova_compute[187157]: 2025-12-03 00:28:31.659 187161 INFO oslo.privsep.daemon [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmppqe9zpjo/privsep.sock']
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.384 187161 INFO oslo.privsep.daemon [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Spawned new privsep daemon via rootwrap
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.245 222644 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.248 222644 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.250 222644 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_READ_SEARCH|CAP_SYS_ADMIN/CAP_DAC_READ_SEARCH|CAP_SYS_ADMIN/none
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.250 222644 INFO oslo.privsep.daemon [-] privsep daemon running as pid 222644
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.386 222644 DEBUG oslo.privsep.daemon [-] privsep: reply[d3e5667b-4f83-47f1-b35c-e344fee117f7]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.461 222644 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.470 222644 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.471 222644 DEBUG oslo.privsep.daemon [-] privsep: reply[468107f1-3473-4a19-b2b5-a632a52b57e8]: (4, ('InitiatorName=iqn.1994-05.com.redhat:225f7ee77f73', '')) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.472 222644 DEBUG oslo.privsep.daemon [-] privsep: Exception during request[d3c0d217-bf61-4217-8205-309ac44f5678]: [Errno 2] No such file or directory: '/dev/scini' _process_cmd /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:492
Dec 03 00:28:32 compute-1 nova_compute[187157]: Traceback (most recent call last):
Dec 03 00:28:32 compute-1 nova_compute[187157]:   File "/usr/lib/python3.12/site-packages/oslo_privsep/daemon.py", line 489, in _process_cmd
Dec 03 00:28:32 compute-1 nova_compute[187157]:     ret = func(*f_args, **f_kwargs)
Dec 03 00:28:32 compute-1 nova_compute[187157]:           ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 03 00:28:32 compute-1 nova_compute[187157]:   File "/usr/lib/python3.12/site-packages/oslo_privsep/priv_context.py", line 270, in _wrap
Dec 03 00:28:32 compute-1 nova_compute[187157]:     return func(*args, **kwargs)
Dec 03 00:28:32 compute-1 nova_compute[187157]:            ^^^^^^^^^^^^^^^^^^^^^
Dec 03 00:28:32 compute-1 nova_compute[187157]:   File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 57, in get_guid
Dec 03 00:28:32 compute-1 nova_compute[187157]:     with open_scini_device() as fd:
Dec 03 00:28:32 compute-1 nova_compute[187157]:          ^^^^^^^^^^^^^^^^^^^
Dec 03 00:28:32 compute-1 nova_compute[187157]:   File "/usr/lib64/python3.12/contextlib.py", line 137, in __enter__
Dec 03 00:28:32 compute-1 nova_compute[187157]:     return next(self.gen)
Dec 03 00:28:32 compute-1 nova_compute[187157]:            ^^^^^^^^^^^^^^
Dec 03 00:28:32 compute-1 nova_compute[187157]:   File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 40, in open_scini_device
Dec 03 00:28:32 compute-1 nova_compute[187157]:     fd = os.open(SCINI_DEVICE_PATH, os.O_RDWR)
Dec 03 00:28:32 compute-1 nova_compute[187157]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 03 00:28:32 compute-1 nova_compute[187157]: FileNotFoundError: [Errno 2] No such file or directory: '/dev/scini'
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.474 222644 DEBUG oslo.privsep.daemon [-] privsep: reply[d3c0d217-bf61-4217-8205-309ac44f5678]: (5, 'builtins.FileNotFoundError', (2, 'No such file or directory'), 'Traceback (most recent call last):\n  File "/usr/lib/python3.12/site-packages/oslo_privsep/daemon.py", line 489, in _process_cmd\n    ret = func(*f_args, **f_kwargs)\n          ^^^^^^^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/oslo_privsep/priv_context.py", line 270, in _wrap\n    return func(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 57, in get_guid\n    with open_scini_device() as fd:\n         ^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib64/python3.12/contextlib.py", line 137, in __enter__\n    return next(self.gen)\n           ^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 40, in open_scini_device\n    fd = os.open(SCINI_DEVICE_PATH, os.O_RDWR)\n         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\nFileNotFoundError: [Errno 2] No such file or directory: \'/dev/scini\'\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.474 187161 ERROR os_brick.initiator.connectors.scaleio [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Error querying sdc guid: [Errno 2] No such file or directory: FileNotFoundError: [Errno 2] No such file or directory
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.475 187161 INFO os_brick.initiator.connectors.scaleio [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Unable to find SDC guid: Error querying sdc guid: [Errno 2] No such file or directory
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.475 222644 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.485 222644 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.485 222644 DEBUG oslo.privsep.daemon [-] privsep: reply[ae3f492c-05d6-44de-887f-9336e4e52979]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.487 222644 DEBUG oslo.privsep.daemon [-] privsep: reply[79dbe06f-961d-43f6-9607-6c615e42f9be]: (4, '8b5693ff-2e25-45a5-bebe-492dc3141f79') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.487 187161 DEBUG oslo_concurrency.processutils [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.509 187161 DEBUG oslo_concurrency.processutils [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] CMD "nvme version" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.511 187161 DEBUG os_brick.initiator.connectors.lightos [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:132
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.513 187161 INFO os_brick.initiator.connectors.lightos [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Current host hostNQN nqn.2014-08.org.nvmexpress:uuid:bf3e0a14-a5f8-4123-aa26-e7cad37b879a and IP(s) are ['38.102.83.74', '192.168.122.101', '172.19.0.101', '172.18.0.101', '172.17.0.101', 'fe80::4c8e:b5ff:fe68:b083', 'fe80::fc16:3eff:fe9b:335', 'fe80::a467:5dff:fe0f:1a7e'] 
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.513 187161 DEBUG os_brick.initiator.connectors.lightos [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:109
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.514 187161 DEBUG os_brick.initiator.connectors.lightos [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:bf3e0a14-a5f8-4123-aa26-e7cad37b879a dsc:  get_connector_properties /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:112
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.514 187161 DEBUG os_brick.utils [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] <== get_connector_properties: return (855ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'enforce_multipath': True, 'initiator': 'iqn.1994-05.com.redhat:225f7ee77f73', 'do_local_attach': False, 'nvme_hostid': 'bf3e0a14-a5f8-4123-aa26-e7cad37b879a', 'system uuid': '8b5693ff-2e25-45a5-bebe-492dc3141f79', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:bf3e0a14-a5f8-4123-aa26-e7cad37b879a', 'nvme_native_multipath': True, 'found_dsc': '', 'host_ips': ['38.102.83.74', '192.168.122.101', '172.19.0.101', '172.18.0.101', '172.17.0.101', 'fe80::4c8e:b5ff:fe68:b083', 'fe80::fc16:3eff:fe9b:335', 'fe80::a467:5dff:fe0f:1a7e']} trace_logging_wrapper /usr/lib/python3.12/site-packages/os_brick/utils.py:204
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.514 187161 DEBUG nova.virt.block_device [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Updating existing volume attachment record: a781b955-acff-43ee-8698-680b526fb9bb _volume_attach /usr/lib/python3.12/site-packages/nova/virt/block_device.py:666
Dec 03 00:28:32 compute-1 nova_compute[187157]: 2025-12-03 00:28:32.883 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:34 compute-1 nova_compute[187157]: 2025-12-03 00:28:34.661 187161 DEBUG oslo_concurrency.lockutils [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Acquiring lock "cache_volume_driver" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:34 compute-1 nova_compute[187157]: 2025-12-03 00:28:34.663 187161 DEBUG oslo_concurrency.lockutils [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:34 compute-1 nova_compute[187157]: 2025-12-03 00:28:34.663 187161 DEBUG oslo_concurrency.lockutils [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "cache_volume_driver" "released" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:34 compute-1 nova_compute[187157]: 2025-12-03 00:28:34.664 187161 DEBUG nova.virt.libvirt.volume.mount [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Got _HostMountState generation 0 get_state /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:91
Dec 03 00:28:34 compute-1 nova_compute[187157]: 2025-12-03 00:28:34.664 187161 DEBUG nova.virt.libvirt.volume.mount [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] _HostMountState.mount(fstype=nfs, export=172.18.0.100:/data/cinder_backend_1, vol_name=volume-fec9baad-7985-4f1a-a1d4-0469965aa4e9, /var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540, options=[]) generation 0 mount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:288
Dec 03 00:28:34 compute-1 nova_compute[187157]: 2025-12-03 00:28:34.665 187161 DEBUG nova.virt.libvirt.volume.mount [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Mounting /var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540 generation 0 mount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:301
Dec 03 00:28:34 compute-1 kernel: FS-Cache: Loaded
Dec 03 00:28:34 compute-1 kernel: Key type dns_resolver registered
Dec 03 00:28:35 compute-1 kernel: NFS: Registering the id_resolver key type
Dec 03 00:28:35 compute-1 kernel: Key type id_resolver registered
Dec 03 00:28:35 compute-1 kernel: Key type id_legacy registered
Dec 03 00:28:35 compute-1 nfsrahead[222682]: setting /var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540 readahead to 128
Dec 03 00:28:35 compute-1 nova_compute[187157]: 2025-12-03 00:28:35.213 187161 DEBUG nova.virt.libvirt.volume.mount [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] _HostMountState.mount() for /var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540 generation 0 completed successfully mount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:334
Dec 03 00:28:35 compute-1 systemd[1]: Starting libvirt secret daemon...
Dec 03 00:28:35 compute-1 systemd[1]: Started libvirt secret daemon.
Dec 03 00:28:35 compute-1 nova_compute[187157]: 2025-12-03 00:28:35.292 187161 DEBUG nova.virt.libvirt.guest [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] attach device xml: <disk type="file" device="disk">
Dec 03 00:28:35 compute-1 nova_compute[187157]:   <driver name="qemu" type="raw" cache="none" io="native"/>
Dec 03 00:28:35 compute-1 nova_compute[187157]:   <alias name="ua-fec9baad-7985-4f1a-a1d4-0469965aa4e9"/>
Dec 03 00:28:35 compute-1 nova_compute[187157]:   <source file="/var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540/volume-fec9baad-7985-4f1a-a1d4-0469965aa4e9"/>
Dec 03 00:28:35 compute-1 nova_compute[187157]:   <target dev="vdb" bus="virtio"/>
Dec 03 00:28:35 compute-1 nova_compute[187157]:   <serial>fec9baad-7985-4f1a-a1d4-0469965aa4e9</serial>
Dec 03 00:28:35 compute-1 nova_compute[187157]: </disk>
Dec 03 00:28:35 compute-1 nova_compute[187157]:  attach_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:336
Dec 03 00:28:35 compute-1 ovn_controller[95464]: 2025-12-03T00:28:35Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9b:03:35 10.100.0.8
Dec 03 00:28:35 compute-1 ovn_controller[95464]: 2025-12-03T00:28:35Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9b:03:35 10.100.0.8
Dec 03 00:28:35 compute-1 podman[197537]: time="2025-12-03T00:28:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:28:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:28:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:28:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:28:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3077 "" "Go-http-client/1.1"
Dec 03 00:28:35 compute-1 nova_compute[187157]: ========================================================================
Dec 03 00:28:35 compute-1 nova_compute[187157]: ====                        Guru Meditation                         ====
Dec 03 00:28:35 compute-1 nova_compute[187157]: ========================================================================
Dec 03 00:28:35 compute-1 nova_compute[187157]: ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ========================================================================
Dec 03 00:28:35 compute-1 nova_compute[187157]: ====                            Package                             ====
Dec 03 00:28:35 compute-1 nova_compute[187157]: ========================================================================
Dec 03 00:28:35 compute-1 nova_compute[187157]: product = OpenStack Compute
Dec 03 00:28:35 compute-1 nova_compute[187157]: vendor = RDO
Dec 03 00:28:35 compute-1 nova_compute[187157]: version = 32.1.0-0.20251105112212.710ffbb.el10
Dec 03 00:28:35 compute-1 nova_compute[187157]: ========================================================================
Dec 03 00:28:35 compute-1 nova_compute[187157]: ====                            Threads                             ====
Dec 03 00:28:35 compute-1 nova_compute[187157]: ========================================================================
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544231577280                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:214 in _native_thread
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `libvirt.virEventRunDefaultImpl()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/site-packages/libvirt.py:441 in virEventRunDefaultImpl
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `ret = libvirtmod.virEventRunDefaultImpl()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544239969984                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544248362688                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544256755392                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544265148096                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544273540800                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544281933504                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544432903872                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544441296576                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544449689280                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544458081984                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544466474688                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544474867392                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544483260096                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544559535808                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544567944896                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544576353984                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544584763072                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544593172160                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544601581248                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544609990336                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `msg = _reqq.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/queue.py:171 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.not_empty.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:355 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `waiter.acquire()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                  Thread #140544772275840                   ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_reports/guru_meditation_report.py:178 in _handler
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `cls.handle_signal(version, service_name, log_dir, None)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_reports/guru_meditation_report.py:217 in handle_signal
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `res = cls(version, frame).run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_reports/guru_meditation_report.py:266 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return super().run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_reports/report.py:76 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return "\n".join(str(sect) for sect in self.sections)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_reports/report.py:76 in <genexpr>
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return "\n".join(str(sect) for sect in self.sections)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_reports/report.py:101 in __str__
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.view(self.generator())`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_reports/report.py:130 in newgen
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `res = gen()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_reports/generators/threading.py:67 in __call__
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `thread_id: tm.ThreadModel(thread_id, stack)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ========================================================================
Dec 03 00:28:35 compute-1 nova_compute[187157]: ====                         Green Threads                          ====
Dec 03 00:28:35 compute-1 nova_compute[187157]: ========================================================================
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/bin/nova-compute:8 in <module>
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `sys.exit(main())`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/cmd/compute.py:62 in main
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `service.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/service.py:335 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `_launcher.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:300 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `status, signo = self._wait_for_exit_or_signal()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:278 in _wait_for_exit_or_signal
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `super().wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:213 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.services.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:690 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.tg.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/threadgroup.py:368 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._wait_threads()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/threadgroup.py:343 in _wait_threads
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._perform_action_on_threads(`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/threadgroup.py:270 in _perform_action_on_threads
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `action_func(x)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/threadgroup.py:344 in <lambda>
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `lambda x: x.wait(),`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/threadgroup.py:63 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.thread.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenthread.py:232 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self._exit_event.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/event.py:124 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `result = hub.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: 2025-12-03 00:28:35.905 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py:577 in poll
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.conn.consume(timeout=current_timeout)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1477 in consume
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.ensure(_consume,`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1173 in ensure
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `ret, channel = autoretry_method()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/kombu/connection.py:556 in _ensured
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return fun(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/kombu/connection.py:639 in __call__
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return fun(*args, channel=channels[0], **kwargs), channels[0]`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1162 in execute_method
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `method()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1464 in _consume
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.connection.drain_events(timeout=poll_timeout)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/kombu/connection.py:341 in drain_events
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.transport.drain_events(self.connection, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/kombu/transport/pyamqp.py:171 in drain_events
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return connection.drain_events(**kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/amqp/connection.py:526 in drain_events
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `while not self.blocking_read(timeout):`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/amqp/connection.py:531 in blocking_read
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `frame = self.transport.read_frame()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/amqp/transport.py:294 in read_frame
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `frame_header = read(7, True)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/amqp/transport.py:574 in _read
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `s = recv(n - len(rbuf))  # see note above`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/ssl.py:196 in read
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self._call_trampolining(`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/ssl.py:169 in _call_trampolining
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `trampoline(self,`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return hub.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1393 in _heartbeat_thread_job
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._heartbeat_exit_event.wait(`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:655 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `signaled = self._cond.wait(timeout)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:359 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `gotit = waiter.acquire(True, timeout)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/semaphore.py:107 in acquire
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `hubs.get_hub().switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1393 in _heartbeat_thread_job
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._heartbeat_exit_event.wait(`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:655 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `signaled = self._cond.wait(timeout)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:359 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `gotit = waiter.acquire(True, timeout)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/semaphore.py:107 in acquire
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `hubs.get_hub().switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1393 in _heartbeat_thread_job
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._heartbeat_exit_event.wait(`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:655 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `signaled = self._cond.wait(timeout)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:359 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `gotit = waiter.acquire(True, timeout)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/semaphore.py:107 in acquire
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `hubs.get_hub().switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1393 in _heartbeat_thread_job
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._heartbeat_exit_event.wait(`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:655 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `signaled = self._cond.wait(timeout)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:359 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `gotit = waiter.acquire(True, timeout)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/semaphore.py:107 in acquire
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `hubs.get_hub().switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:154 in _reader_main
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `for msg in reader:`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:91 in __next__
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `buf = self.readsock.recv(4096)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:352 in recv
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self._recv_loop(self.fd.recv, b'', bufsize, flags)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:346 in _recv_loop
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._read_trampoline()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:314 in _read_trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._trampoline(`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:206 in _trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return trampoline(fd, read=read, write=write, timeout=timeout,`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return hub.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:154 in _reader_main
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `for msg in reader:`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:91 in __next__
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `buf = self.readsock.recv(4096)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:352 in recv
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self._recv_loop(self.fd.recv, b'', bufsize, flags)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:346 in _recv_loop
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._read_trampoline()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:314 in _read_trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._trampoline(`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:206 in _trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return trampoline(fd, read=read, write=write, timeout=timeout,`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return hub.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:154 in _reader_main
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `for msg in reader:`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:91 in __next__
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `buf = self.readsock.recv(4096)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:352 in recv
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self._recv_loop(self.fd.recv, b'', bufsize, flags)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:346 in _recv_loop
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._read_trampoline()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:314 in _read_trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._trampoline(`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:206 in _trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return trampoline(fd, read=read, write=write, timeout=timeout,`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return hub.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:267 in logger
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `for line in f:`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:105 in readinto
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `data = self.read(up_to)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:84 in read
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return _original_os.read(self._fileno, size)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/os.py:47 in read
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `hubs.trampoline(fd, read=True)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return hub.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:267 in logger
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `for line in f:`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:105 in readinto
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `data = self.read(up_to)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:84 in read
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return _original_os.read(self._fileno, size)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/os.py:47 in read
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `hubs.trampoline(fd, read=True)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return hub.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:267 in logger
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `for line in f:`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:105 in readinto
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `data = self.read(up_to)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:84 in read
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return _original_os.read(self._fileno, size)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/os.py:47 in read
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `hubs.trampoline(fd, read=True)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return hub.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_utils/excutils.py:257 in wrapper
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return infunc(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/base.py:294 in _runner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `incoming = self._poll_style_listener.poll(`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/base.py:42 in wrapper
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `message = func(in_self, timeout=watch.leftover(True))`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py:429 in poll
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.conn.consume(timeout=min(self._current_timeout, left))`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1477 in consume
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.ensure(_consume,`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1173 in ensure
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `ret, channel = autoretry_method()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/kombu/connection.py:556 in _ensured
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return fun(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/kombu/connection.py:639 in __call__
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return fun(*args, channel=channels[0], **kwargs), channels[0]`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1162 in execute_method
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `method()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1464 in _consume
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.connection.drain_events(timeout=poll_timeout)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/kombu/connection.py:341 in drain_events
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.transport.drain_events(self.connection, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/kombu/transport/pyamqp.py:171 in drain_events
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return connection.drain_events(**kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/amqp/connection.py:526 in drain_events
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `while not self.blocking_read(timeout):`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/amqp/connection.py:531 in blocking_read
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `frame = self.transport.read_frame()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/amqp/transport.py:294 in read_frame
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `frame_header = read(7, True)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/amqp/transport.py:574 in _read
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `s = recv(n - len(rbuf))  # see note above`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/ssl.py:196 in read
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self._call_trampolining(`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/ssl.py:169 in _call_trampolining
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `trampoline(self,`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return hub.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `bootstrap_inner()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/threading.py:1012 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._target(*self._args, **self._kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/connection.py:108 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.poller.block()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/site-packages/ovs/poller.py:231 in block
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `events = self.poll.poll(self.timeout)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib64/python3.12/site-packages/ovs/poller.py:137 in poll
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `rlist, wlist, xlist = select.select(self.rlist,`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/select.py:80 in select
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return hub.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenpool.py:87 in _spawn_n_impl
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/futurist/_green.py:69 in __call__
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.work.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/futurist/_utils.py:45 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `result = self.fn(*self.args, **self.kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/utils.py:584 in context_wrapper
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:225 in _dispatch_thread
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._dispatch_events()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:393 in _dispatch_events
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `_c = self._event_notify_recv.read(1)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:84 in read
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return _original_os.read(self._fileno, size)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/green/os.py:47 in read
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `hubs.trampoline(fd, read=True)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return hub.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenpool.py:87 in _spawn_n_impl
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/futurist/_green.py:69 in __call__
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.work.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/futurist/_utils.py:45 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `result = self.fn(*self.args, **self.kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/utils.py:584 in context_wrapper
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:233 in _conn_event_thread
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._dispatch_conn_event()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:239 in _dispatch_conn_event
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `handler = self._conn_event_handler_queue.get()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/queue.py:321 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return waiter.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/queue.py:140 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return get_hub().switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenpool.py:87 in _spawn_n_impl
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `func(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/futurist/_green.py:69 in __call__
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.work.run()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/futurist/_utils.py:45 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `result = self.fn(*self.args, **self.kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/rpc/server.py:174 in _process_incoming
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `res = self.dispatcher.dispatch(message)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py:309 in dispatch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self._do_dispatch(endpoint, method, ctxt, args)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py:229 in _do_dispatch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `result = func(ctxt, **new_args)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/exception_wrapper.py:63 in wrapped
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return f(self, context, *args, **kw)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/compute/utils.py:1483 in decorated_function
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return function(self, context, *args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/compute/manager.py:203 in decorated_function
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return function(self, context, *args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/compute/manager.py:8098 in attach_volume
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `do_attach_volume(context, instance, driver_bdm)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:415 in inner
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return f(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/compute/manager.py:8093 in do_attach_volume
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self._attach_volume(context, instance, driver_bdm)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/compute/manager.py:8112 in _attach_volume
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `bdm.attach(context, instance, self.volume_api, self.driver,`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/virt/block_device.py:46 in wrapped
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `ret_val = method(obj, context, *args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/virt/block_device.py:769 in attach
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._do_attach(context, instance, volume, volume_api,`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/virt/block_device.py:754 in _do_attach
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._volume_attach(context, volume, connector, instance,`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/virt/block_device.py:692 in _volume_attach
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `virt_driver.attach_volume(`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2317 in attach_volume
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `instance.device_metadata = self._build_device_metadata(`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13101 in _build_device_metadata
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `vifs = objects.VirtualInterfaceList.get_by_instance_uuid(context,`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py:175 in wrapper
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `result = cls.indirection_api.object_class_action_versions(`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/nova/conductor/rpcapi.py:240 in object_class_action_versions
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return cctxt.call(context, 'object_class_action_versions',`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py:180 in call
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `result = self.transport._send(`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/transport.py:123 in _send
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self._driver.send(target, ctxt, message,`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py:794 in send
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self._send(target, ctxt, message, wait_for_reply, timeout,`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py:783 in _send
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `result = self._waiter.wait(msg_id, timeout,`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py:654 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `message = self.waiters.get(msg_id, timeout=timeout)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py:519 in get
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `time.sleep(0.5)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/os_brick/utils.py:49 in _sleep
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `_time_sleep(secs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenthread.py:45 in sleep
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `hub.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenthread.py:272 in main
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `result = function(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:161 in _run_loop
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._sleep(idle)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:109 in _sleep
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._abort.wait(timeout)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_utils/eventletutils.py:178 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `event.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/event.py:124 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `result = hub.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenthread.py:272 in main
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `result = function(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:161 in _run_loop
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._sleep(idle)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:109 in _sleep
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._abort.wait(timeout)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_utils/eventletutils.py:178 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `event.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/event.py:124 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `result = hub.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenthread.py:272 in main
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `result = function(*args, **kwargs)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:725 in run_service
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `done.wait()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/event.py:124 in wait
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `result = hub.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:352 in run
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self.fire_timers(self.clock())`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:471 in fire_timers
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `timer()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/timer.py:59 in __call__
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `cb(*args, **kw)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:56 in tpool_trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `_c = _rsock.recv(1)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:352 in recv
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self._recv_loop(self.fd.recv, b'', bufsize, flags)`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:346 in _recv_loop
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._read_trampoline()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:314 in _read_trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `self._trampoline(`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:206 in _trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return trampoline(fd, read=read, write=write, timeout=timeout,`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return hub.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Dec 03 00:28:35 compute-1 nova_compute[187157]:     `return self.greenlet.switch()`
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: No Traceback!
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: No Traceback!
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: No Traceback!
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: No Traceback!
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ------                        Green Thread                        ------
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: No Traceback!
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ========================================================================
Dec 03 00:28:35 compute-1 nova_compute[187157]: ====                           Processes                            ====
Dec 03 00:28:35 compute-1 nova_compute[187157]: ========================================================================
Dec 03 00:28:35 compute-1 nova_compute[187157]: Process 187161 (under 187159) [ run by: nova (42436), state: running ]
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ========================================================================
Dec 03 00:28:35 compute-1 nova_compute[187157]: ====                         Configuration                          ====
Dec 03 00:28:35 compute-1 nova_compute[187157]: ========================================================================
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: api: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   compute_link_prefix = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01
Dec 03 00:28:35 compute-1 nova_compute[187157]:   dhcp_domain = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   enable_instance_password = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   glance_link_prefix = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   instance_list_cells_batch_fixed_size = 100
Dec 03 00:28:35 compute-1 nova_compute[187157]:   instance_list_cells_batch_strategy = distributed
Dec 03 00:28:35 compute-1 nova_compute[187157]:   instance_list_per_project_cells = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   list_records_by_skipping_down_cells = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   local_metadata_per_cell = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_limit = 1000
Dec 03 00:28:35 compute-1 nova_compute[187157]:   metadata_cache_expiration = 15
Dec 03 00:28:35 compute-1 nova_compute[187157]:   neutron_default_project_id = default
Dec 03 00:28:35 compute-1 nova_compute[187157]:   response_validation = warn
Dec 03 00:28:35 compute-1 nova_compute[187157]:   use_neutron_default_nets = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vendordata_dynamic_connect_timeout = 5
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vendordata_dynamic_failure_fatal = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vendordata_dynamic_read_timeout = 5
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vendordata_dynamic_ssl_certfile = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vendordata_dynamic_targets = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vendordata_jsonfile_path = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vendordata_providers = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     StaticJSON
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: api_database: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   asyncio_connection = ***
Dec 03 00:28:35 compute-1 nova_compute[187157]:   asyncio_slave_connection = ***
Dec 03 00:28:35 compute-1 nova_compute[187157]:   backend = sqlalchemy
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connection = ***
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connection_debug = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connection_parameters = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connection_recycle_time = 3600
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connection_trace = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   db_inc_retry_interval = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   db_max_retries = 20
Dec 03 00:28:35 compute-1 nova_compute[187157]:   db_max_retry_interval = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   db_retry_interval = 1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_overflow = 50
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_pool_size = 5
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_retries = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   mysql_sql_mode = TRADITIONAL
Dec 03 00:28:35 compute-1 nova_compute[187157]:   mysql_wsrep_sync_wait = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   pool_timeout = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   retry_interval = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   slave_connection = ***
Dec 03 00:28:35 compute-1 nova_compute[187157]:   sqlite_synchronous = True
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: barbican: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   auth_endpoint = http://localhost/identity/v3
Dec 03 00:28:35 compute-1 nova_compute[187157]:   barbican_api_version = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   barbican_endpoint = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   barbican_endpoint_type = internal
Dec 03 00:28:35 compute-1 nova_compute[187157]:   barbican_region_name = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cafile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   certfile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   collect-timing = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   insecure = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   keyfile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   number_of_retries = 60
Dec 03 00:28:35 compute-1 nova_compute[187157]:   retry_delay = 1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   send_service_user_token = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   split-loggers = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   timeout = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   verify_ssl = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   verify_ssl_path = None
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: barbican_service_user: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   auth_section = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   auth_type = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cafile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   certfile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   collect-timing = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   insecure = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   keyfile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   split-loggers = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   timeout = None
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: cache: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   backend = oslo_cache.dict
Dec 03 00:28:35 compute-1 nova_compute[187157]:   backend_argument = ***
Dec 03 00:28:35 compute-1 nova_compute[187157]:   backend_expiration_time = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   config_prefix = cache.oslo
Dec 03 00:28:35 compute-1 nova_compute[187157]:   dead_timeout = 60.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   debug_cache_backend = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   enable_retry_client = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   enable_socket_keepalive = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   enabled = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   enforce_fips_mode = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   expiration_time = 600
Dec 03 00:28:35 compute-1 nova_compute[187157]:   hashclient_retry_attempts = 2
Dec 03 00:28:35 compute-1 nova_compute[187157]:   hashclient_retry_delay = 1.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   memcache_dead_retry = 300
Dec 03 00:28:35 compute-1 nova_compute[187157]:   memcache_password = ***
Dec 03 00:28:35 compute-1 nova_compute[187157]:   memcache_pool_connection_get_timeout = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   memcache_pool_flush_on_reconnect = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   memcache_pool_maxsize = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   memcache_pool_unused_timeout = 60
Dec 03 00:28:35 compute-1 nova_compute[187157]:   memcache_sasl_enabled = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   memcache_servers = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     localhost:11211
Dec 03 00:28:35 compute-1 nova_compute[187157]:   memcache_socket_timeout = 1.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   memcache_username = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   proxies = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   redis_db = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   redis_password = ***
Dec 03 00:28:35 compute-1 nova_compute[187157]:   redis_sentinel_service_name = mymaster
Dec 03 00:28:35 compute-1 nova_compute[187157]:   redis_sentinels = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     localhost:26379
Dec 03 00:28:35 compute-1 nova_compute[187157]:   redis_server = localhost:6379
Dec 03 00:28:35 compute-1 nova_compute[187157]:   redis_socket_timeout = 1.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   redis_username = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   retry_attempts = 2
Dec 03 00:28:35 compute-1 nova_compute[187157]:   retry_delay = 0.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   socket_keepalive_count = 1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   socket_keepalive_idle = 1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   socket_keepalive_interval = 1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   tls_allowed_ciphers = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   tls_cafile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   tls_certfile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   tls_enabled = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   tls_keyfile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: cinder: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   auth_section = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   auth_type = password
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cafile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   catalog_info = volumev3:cinderv3:internalURL
Dec 03 00:28:35 compute-1 nova_compute[187157]:   certfile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   collect-timing = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cross_az_attach = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   debug = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   endpoint_template = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   http_retries = 3
Dec 03 00:28:35 compute-1 nova_compute[187157]:   insecure = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   keyfile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   os_region_name = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   split-loggers = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   timeout = None
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: compute: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   consecutive_build_service_disable_threshold = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cpu_dedicated_set = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cpu_shared_set = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   image_type_exclude_list = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   live_migration_wait_for_vif_plug = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_concurrent_disk_ops = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_disk_devices_to_attach = -1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   packing_host_numa_cells_allocation_strategy = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   provider_config_location = /etc/nova/provider_config/
Dec 03 00:28:35 compute-1 nova_compute[187157]:   resource_provider_association_refresh = 300
Dec 03 00:28:35 compute-1 nova_compute[187157]:   sharing_providers_max_uuids_per_request = 200
Dec 03 00:28:35 compute-1 nova_compute[187157]:   shutdown_retry_interval = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vmdk_allowed_types = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     monolithicSparse
Dec 03 00:28:35 compute-1 nova_compute[187157]:     streamOptimized
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: conductor: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   workers = None
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: console: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   allowed_origins = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   ssl_ciphers = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   ssl_minimum_version = default
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: consoleauth: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   enforce_session_timeout = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   token_ttl = 600
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: cyborg: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cafile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   certfile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   collect-timing = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connect-retries = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connect-retry-delay = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   endpoint-override = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   insecure = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   keyfile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_version = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   min_version = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   region-name = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   retriable-status-codes = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   service-name = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   service-type = accelerator
Dec 03 00:28:35 compute-1 nova_compute[187157]:   split-loggers = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   status-code-retries = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   status-code-retry-delay = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   timeout = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   valid-interfaces = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     internal
Dec 03 00:28:35 compute-1 nova_compute[187157]:     public
Dec 03 00:28:35 compute-1 nova_compute[187157]:   version = None
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: database: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   asyncio_connection = ***
Dec 03 00:28:35 compute-1 nova_compute[187157]:   asyncio_slave_connection = ***
Dec 03 00:28:35 compute-1 nova_compute[187157]:   backend = sqlalchemy
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connection = ***
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connection_debug = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connection_parameters = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connection_recycle_time = 3600
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connection_trace = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   db_inc_retry_interval = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   db_max_retries = 20
Dec 03 00:28:35 compute-1 nova_compute[187157]:   db_max_retry_interval = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   db_retry_interval = 1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_overflow = 50
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_pool_size = 5
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_retries = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   mysql_sql_mode = TRADITIONAL
Dec 03 00:28:35 compute-1 nova_compute[187157]:   mysql_wsrep_sync_wait = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   pool_timeout = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   retry_interval = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   slave_connection = ***
Dec 03 00:28:35 compute-1 nova_compute[187157]:   sqlite_synchronous = True
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: default: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   allow_resize_to_same_host = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   arq_binding_timeout = 300
Dec 03 00:28:35 compute-1 nova_compute[187157]:   backdoor_port = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   backdoor_socket = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   block_device_allocate_retries = 60
Dec 03 00:28:35 compute-1 nova_compute[187157]:   block_device_allocate_retries_interval = 3
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cell_worker_thread_pool_size = 5
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cert = self.pem
Dec 03 00:28:35 compute-1 nova_compute[187157]:   compute_driver = libvirt.LibvirtDriver
Dec 03 00:28:35 compute-1 nova_compute[187157]:   compute_monitors = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   config-dir = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     /etc/nova/nova.conf.d
Dec 03 00:28:35 compute-1 nova_compute[187157]:   config-file = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     /etc/nova/nova-compute.conf
Dec 03 00:28:35 compute-1 nova_compute[187157]:     /etc/nova/nova.conf
Dec 03 00:28:35 compute-1 nova_compute[187157]:   config_drive_format = iso9660
Dec 03 00:28:35 compute-1 nova_compute[187157]:   config_source = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   console_host = compute-1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   control_exchange = nova
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cpu_allocation_ratio = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   daemon = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   debug = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   default_access_ip_network_name = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   default_availability_zone = nova
Dec 03 00:28:35 compute-1 nova_compute[187157]:   default_ephemeral_format = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   default_green_pool_size = 1000
Dec 03 00:28:35 compute-1 nova_compute[187157]:   default_log_levels = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     amqp=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:     amqplib=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:     boto=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:     dogpile.core.dogpile=INFO
Dec 03 00:28:35 compute-1 nova_compute[187157]:     glanceclient=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:     iso8601=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:     keystoneauth=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:     keystonemiddleware=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:     oslo.cache=INFO
Dec 03 00:28:35 compute-1 nova_compute[187157]:     oslo.messaging=INFO
Dec 03 00:28:35 compute-1 nova_compute[187157]:     oslo.privsep.daemon=INFO
Dec 03 00:28:35 compute-1 nova_compute[187157]:     oslo_messaging=INFO
Dec 03 00:28:35 compute-1 nova_compute[187157]:     oslo_policy=INFO
Dec 03 00:28:35 compute-1 nova_compute[187157]:     qpid=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:     requests.packages.urllib3.connectionpool=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:     requests.packages.urllib3.util.retry=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:     routes.middleware=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:     sqlalchemy=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:     stevedore=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:     suds=INFO
Dec 03 00:28:35 compute-1 nova_compute[187157]:     taskflow=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:     urllib3.connectionpool=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:     urllib3.util.retry=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:     websocket=WARN
Dec 03 00:28:35 compute-1 nova_compute[187157]:   default_schedule_zone = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   default_thread_pool_size = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   disk_allocation_ratio = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   enable_new_services = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   executor_thread_pool_size = 64
Dec 03 00:28:35 compute-1 nova_compute[187157]:   fatal_deprecations = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   flat_injected = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   force_config_drive = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   force_raw_images = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   graceful_shutdown_timeout = 60
Dec 03 00:28:35 compute-1 nova_compute[187157]:   heal_instance_info_cache_interval = -1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   host = compute-1.ctlplane.example.com
Dec 03 00:28:35 compute-1 nova_compute[187157]:   initial_cpu_allocation_ratio = 4.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   initial_disk_allocation_ratio = 0.9
Dec 03 00:28:35 compute-1 nova_compute[187157]:   initial_ram_allocation_ratio = 1.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   injected_network_template = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template
Dec 03 00:28:35 compute-1 nova_compute[187157]:   instance_build_timeout = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   instance_delete_interval = 300
Dec 03 00:28:35 compute-1 nova_compute[187157]:   instance_format = [instance: %(uuid)s] 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   instance_name_template = instance-%08x
Dec 03 00:28:35 compute-1 nova_compute[187157]:   instance_usage_audit = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   instance_usage_audit_period = month
Dec 03 00:28:35 compute-1 nova_compute[187157]:   instance_uuid_format = [instance: %(uuid)s] 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   instances_path = /var/lib/nova/instances
Dec 03 00:28:35 compute-1 nova_compute[187157]:   internal_service_availability_zone = internal
Dec 03 00:28:35 compute-1 nova_compute[187157]:   key = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   live_migration_retry_count = 30
Dec 03 00:28:35 compute-1 nova_compute[187157]:   log-config-append = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   log-date-format = %Y-%m-%d %H:%M:%S
Dec 03 00:28:35 compute-1 nova_compute[187157]:   log-dir = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   log-file = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   log_color = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   log_options = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   log_rotate_interval = 1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   log_rotate_interval_type = days
Dec 03 00:28:35 compute-1 nova_compute[187157]:   log_rotation_type = size
Dec 03 00:28:35 compute-1 nova_compute[187157]:   logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s
Dec 03 00:28:35 compute-1 nova_compute[187157]:   logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d
Dec 03 00:28:35 compute-1 nova_compute[187157]:   logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s
Dec 03 00:28:35 compute-1 nova_compute[187157]:   logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s
Dec 03 00:28:35 compute-1 nova_compute[187157]:   logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s
Dec 03 00:28:35 compute-1 nova_compute[187157]:   long_rpc_timeout = 1800
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_concurrent_builds = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_concurrent_live_migrations = 1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_concurrent_snapshots = 5
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_local_block_devices = 3
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_logfile_count = 1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_logfile_size_mb = 20
Dec 03 00:28:35 compute-1 nova_compute[187157]:   maximum_instance_delete_attempts = 5
Dec 03 00:28:35 compute-1 nova_compute[187157]:   migrate_max_retries = -1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   mkisofs_cmd = /usr/bin/mkisofs
Dec 03 00:28:35 compute-1 nova_compute[187157]:   my_block_storage_ip = 192.168.122.101
Dec 03 00:28:35 compute-1 nova_compute[187157]:   my_ip = 192.168.122.101
Dec 03 00:28:35 compute-1 nova_compute[187157]:   my_shared_fs_storage_ip = 192.168.122.101
Dec 03 00:28:35 compute-1 nova_compute[187157]:   network_allocate_retries = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   non_inheritable_image_properties = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     bittorrent
Dec 03 00:28:35 compute-1 nova_compute[187157]:     cache_in_nova
Dec 03 00:28:35 compute-1 nova_compute[187157]:   osapi_compute_unique_server_name_scope = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   password_length = 12
Dec 03 00:28:35 compute-1 nova_compute[187157]:   periodic_enable = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   periodic_fuzzy_delay = 60
Dec 03 00:28:35 compute-1 nova_compute[187157]:   pointer_model = usbtablet
Dec 03 00:28:35 compute-1 nova_compute[187157]:   preallocate_images = none
Dec 03 00:28:35 compute-1 nova_compute[187157]:   publish_errors = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   pybasedir = /usr/lib/python3.12/site-packages
Dec 03 00:28:35 compute-1 nova_compute[187157]:   ram_allocation_ratio = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rate_limit_burst = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rate_limit_except_level = CRITICAL
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rate_limit_interval = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   reboot_timeout = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   reclaim_instance_interval = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   record = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   reimage_timeout_per_gb = 20
Dec 03 00:28:35 compute-1 nova_compute[187157]:   report_interval = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rescue_timeout = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   reserved_host_cpus = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   reserved_host_disk_mb = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   reserved_host_memory_mb = 512
Dec 03 00:28:35 compute-1 nova_compute[187157]:   reserved_huge_pages = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   resize_confirm_window = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   resize_fs_using_block_device = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   resume_guests_state_on_host_boot = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rootwrap_config = /etc/nova/rootwrap.conf
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rpc_ping_enabled = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rpc_response_timeout = 60
Dec 03 00:28:35 compute-1 nova_compute[187157]:   run_external_periodic_tasks = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   running_deleted_instance_action = reap
Dec 03 00:28:35 compute-1 nova_compute[187157]:   running_deleted_instance_poll_interval = 1800
Dec 03 00:28:35 compute-1 nova_compute[187157]:   running_deleted_instance_timeout = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   scheduler_instance_sync_interval = 120
Dec 03 00:28:35 compute-1 nova_compute[187157]:   service_down_time = 60
Dec 03 00:28:35 compute-1 nova_compute[187157]:   servicegroup_driver = db
Dec 03 00:28:35 compute-1 nova_compute[187157]:   shell_completion = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   shelved_offload_time = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   shelved_poll_interval = 3600
Dec 03 00:28:35 compute-1 nova_compute[187157]:   shutdown_timeout = 60
Dec 03 00:28:35 compute-1 nova_compute[187157]:   source_is_ipv6 = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   ssl_only = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   state_path = /var/lib/nova
Dec 03 00:28:35 compute-1 nova_compute[187157]:   sync_power_state_interval = 600
Dec 03 00:28:35 compute-1 nova_compute[187157]:   sync_power_state_pool_size = 1000
Dec 03 00:28:35 compute-1 nova_compute[187157]:   syslog-log-facility = LOG_USER
Dec 03 00:28:35 compute-1 nova_compute[187157]:   tempdir = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   thread_pool_statistic_period = -1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   timeout_nbd = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   transport_url = ***
Dec 03 00:28:35 compute-1 nova_compute[187157]:   update_resources_interval = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   use-journal = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   use-json = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   use-syslog = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   use_cow_images = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   use_rootwrap_daemon = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   use_stderr = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vcpu_pin_set = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vif_plugging_is_fatal = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vif_plugging_timeout = 300
Dec 03 00:28:35 compute-1 nova_compute[187157]:   virt_mkfs = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   volume_usage_poll_interval = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   watch-log-file = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   web = /usr/share/spice-html5
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: devices: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   enabled_mdev_types = 
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ephemeral_storage_encryption: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cipher = aes-xts-plain64
Dec 03 00:28:35 compute-1 nova_compute[187157]:   default_format = luks
Dec 03 00:28:35 compute-1 nova_compute[187157]:   enabled = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   key_size = 512
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: filter_scheduler: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   aggregate_image_properties_isolation_namespace = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   aggregate_image_properties_isolation_separator = .
Dec 03 00:28:35 compute-1 nova_compute[187157]:   available_filters = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     nova.scheduler.filters.all_filters
Dec 03 00:28:35 compute-1 nova_compute[187157]:   build_failure_weight_multiplier = 1000000.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cpu_weight_multiplier = 1.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cross_cell_move_weight_multiplier = 1000000.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   disk_weight_multiplier = 1.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   enabled_filters = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     ComputeCapabilitiesFilter
Dec 03 00:28:35 compute-1 nova_compute[187157]:     ComputeFilter
Dec 03 00:28:35 compute-1 nova_compute[187157]:     ImagePropertiesFilter
Dec 03 00:28:35 compute-1 nova_compute[187157]:     ServerGroupAffinityFilter
Dec 03 00:28:35 compute-1 nova_compute[187157]:     ServerGroupAntiAffinityFilter
Dec 03 00:28:35 compute-1 nova_compute[187157]:   host_subset_size = 1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   hypervisor_version_weight_multiplier = 1.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   image_properties_default_architecture = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   image_props_weight_multiplier = 0.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   image_props_weight_setting = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   io_ops_weight_multiplier = -1.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   isolated_hosts = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   isolated_images = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_instances_per_host = 50
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_io_ops_per_host = 8
Dec 03 00:28:35 compute-1 nova_compute[187157]:   num_instances_weight_multiplier = 0.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   pci_in_placement = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   pci_weight_multiplier = 1.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   ram_weight_multiplier = 1.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   restrict_isolated_hosts_to_isolated_images = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   shuffle_best_same_weighed_hosts = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   soft_affinity_weight_multiplier = 1.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   soft_anti_affinity_weight_multiplier = 1.0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   track_instance_changes = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   weight_classes = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     nova.scheduler.weights.all_weighers
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: glance: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   api_servers = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cafile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   certfile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   collect-timing = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connect-retries = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connect-retry-delay = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   debug = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   default_trusted_certificate_ids = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   enable_certificate_validation = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   enable_rbd_download = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   endpoint-override = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   insecure = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   keyfile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_version = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   min_version = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   num_retries = 3
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rbd_ceph_conf = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rbd_connect_timeout = 5
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rbd_pool = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rbd_user = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   region-name = regionOne
Dec 03 00:28:35 compute-1 nova_compute[187157]:   retriable-status-codes = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   service-name = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   service-type = image
Dec 03 00:28:35 compute-1 nova_compute[187157]:   split-loggers = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   status-code-retries = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   status-code-retry-delay = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   timeout = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   valid-interfaces = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     internal
Dec 03 00:28:35 compute-1 nova_compute[187157]:   verify_glance_signatures = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   version = None
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: guestfs: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   debug = False
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: image_cache: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   manager_interval = 2400
Dec 03 00:28:35 compute-1 nova_compute[187157]:   precache_concurrency = 1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   remove_unused_base_images = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   remove_unused_original_minimum_age_seconds = 86400
Dec 03 00:28:35 compute-1 nova_compute[187157]:   remove_unused_resized_minimum_age_seconds = 3600
Dec 03 00:28:35 compute-1 nova_compute[187157]:   subdirectory_name = _base
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: ironic: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   api_max_retries = 60
Dec 03 00:28:35 compute-1 nova_compute[187157]:   api_retry_interval = 2
Dec 03 00:28:35 compute-1 nova_compute[187157]:   auth_section = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   auth_type = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cafile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   certfile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   collect-timing = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   conductor_group = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connect-retries = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connect-retry-delay = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   endpoint-override = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   insecure = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   keyfile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_version = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   min_version = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   peer_list = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   region-name = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   retriable-status-codes = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   serial_console_state_timeout = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   service-name = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   service-type = baremetal
Dec 03 00:28:35 compute-1 nova_compute[187157]:   shard = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   split-loggers = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   status-code-retries = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   status-code-retry-delay = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   timeout = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   valid-interfaces = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     internal
Dec 03 00:28:35 compute-1 nova_compute[187157]:     public
Dec 03 00:28:35 compute-1 nova_compute[187157]:   version = None
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: key_manager: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   backend = barbican
Dec 03 00:28:35 compute-1 nova_compute[187157]:   fixed_key = ***
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: keystone: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cafile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   certfile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   collect-timing = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connect-retries = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connect-retry-delay = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   endpoint-override = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   insecure = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   keyfile = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_version = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   min_version = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   region-name = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   retriable-status-codes = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   service-name = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   service-type = identity
Dec 03 00:28:35 compute-1 nova_compute[187157]:   split-loggers = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   status-code-retries = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   status-code-retry-delay = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   timeout = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   valid-interfaces = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     internal
Dec 03 00:28:35 compute-1 nova_compute[187157]:     public
Dec 03 00:28:35 compute-1 nova_compute[187157]:   version = None
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:35 compute-1 nova_compute[187157]: libvirt: 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   ceph_mount_options = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   ceph_mount_point_base = /var/lib/nova/mnt
Dec 03 00:28:35 compute-1 nova_compute[187157]:   connection_uri = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cpu_mode = custom
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cpu_model_extra_flags = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cpu_models = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     Nehalem
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cpu_power_governor_high = performance
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cpu_power_governor_low = powersave
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cpu_power_management = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   cpu_power_management_strategy = cpu_state
Dec 03 00:28:35 compute-1 nova_compute[187157]:   device_detach_attempts = 8
Dec 03 00:28:35 compute-1 nova_compute[187157]:   device_detach_timeout = 20
Dec 03 00:28:35 compute-1 nova_compute[187157]:   disk_cachemodes = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   disk_prefix = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   enabled_perf_events = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   file_backed_memory = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   gid_maps = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   hw_disk_discard = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   hw_machine_type = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:     x86_64=q35
Dec 03 00:28:35 compute-1 nova_compute[187157]:   images_rbd_ceph_conf = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   images_rbd_glance_copy_poll_interval = 15
Dec 03 00:28:35 compute-1 nova_compute[187157]:   images_rbd_glance_copy_timeout = 600
Dec 03 00:28:35 compute-1 nova_compute[187157]:   images_rbd_glance_store_name = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   images_rbd_pool = rbd
Dec 03 00:28:35 compute-1 nova_compute[187157]:   images_type = qcow2
Dec 03 00:28:35 compute-1 nova_compute[187157]:   images_volume_group = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   inject_key = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   inject_partition = -2
Dec 03 00:28:35 compute-1 nova_compute[187157]:   inject_password = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   iscsi_iface = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   iser_use_multipath = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   live_migration_bandwidth = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   live_migration_completion_timeout = 800
Dec 03 00:28:35 compute-1 nova_compute[187157]:   live_migration_downtime = 500
Dec 03 00:28:35 compute-1 nova_compute[187157]:   live_migration_downtime_delay = 75
Dec 03 00:28:35 compute-1 nova_compute[187157]:   live_migration_downtime_steps = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   live_migration_inbound_addr = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   live_migration_permit_auto_converge = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   live_migration_permit_post_copy = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   live_migration_scheme = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   live_migration_timeout_action = force_complete
Dec 03 00:28:35 compute-1 nova_compute[187157]:   live_migration_tunnelled = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   live_migration_uri = qemu+tls://%s/system
Dec 03 00:28:35 compute-1 nova_compute[187157]:   live_migration_with_native_tls = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   max_queues = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   mem_stats_period_seconds = 10
Dec 03 00:28:35 compute-1 nova_compute[187157]:   migration_inbound_addr = 192.168.122.101
Dec 03 00:28:35 compute-1 nova_compute[187157]:   nfs_mount_options = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   nfs_mount_point_base = /var/lib/nova/mnt
Dec 03 00:28:35 compute-1 nova_compute[187157]:   num_aoe_discover_tries = 3
Dec 03 00:28:35 compute-1 nova_compute[187157]:   num_iser_scan_tries = 5
Dec 03 00:28:35 compute-1 nova_compute[187157]:   num_memory_encrypted_guests = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   num_nvme_discover_tries = 5
Dec 03 00:28:35 compute-1 nova_compute[187157]:   num_pcie_ports = 24
Dec 03 00:28:35 compute-1 nova_compute[187157]:   num_volume_scan_tries = 5
Dec 03 00:28:35 compute-1 nova_compute[187157]:   pmem_namespaces = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   quobyte_client_cfg = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   quobyte_mount_point_base = /var/lib/nova/mnt
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rbd_connect_timeout = 5
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rbd_destroy_volume_retries = 12
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rbd_destroy_volume_retry_interval = 5
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rbd_secret_uuid = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rbd_user = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   realtime_scheduler_priority = 1
Dec 03 00:28:35 compute-1 nova_compute[187157]:   remote_filesystem_transport = ssh
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rescue_image_id = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rescue_kernel_id = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rescue_ramdisk_id = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rng_dev_path = /dev/urandom
Dec 03 00:28:35 compute-1 nova_compute[187157]:   rx_queue_size = 512
Dec 03 00:28:35 compute-1 nova_compute[187157]:   smbfs_mount_options = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   smbfs_mount_point_base = /var/lib/nova/mnt
Dec 03 00:28:35 compute-1 nova_compute[187157]:   snapshot_compression = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   snapshot_image_format = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   snapshots_directory = /var/lib/nova/instances/snapshots
Dec 03 00:28:35 compute-1 nova_compute[187157]:   sparse_logical_volumes = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   swtpm_enabled = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   swtpm_group = tss
Dec 03 00:28:35 compute-1 nova_compute[187157]:   swtpm_user = tss
Dec 03 00:28:35 compute-1 nova_compute[187157]:   sysinfo_serial = unique
Dec 03 00:28:35 compute-1 nova_compute[187157]:   tb_cache_size = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   tx_queue_size = 512
Dec 03 00:28:35 compute-1 nova_compute[187157]:   uid_maps = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   use_virtio_for_bridges = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   virt_type = kvm
Dec 03 00:28:35 compute-1 nova_compute[187157]:   volume_clear = zero
Dec 03 00:28:35 compute-1 nova_compute[187157]:   volume_clear_size = 0
Dec 03 00:28:35 compute-1 nova_compute[187157]:   volume_enforce_multipath = False
Dec 03 00:28:35 compute-1 nova_compute[187157]:   volume_use_multipath = True
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vzstorage_cache_path = None
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vzstorage_mount_group = qemu
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vzstorage_mount_opts = 
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vzstorage_mount_perms = 0770
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vzstorage_mount_point_base = /var/lib/nova/mnt
Dec 03 00:28:35 compute-1 nova_compute[187157]:   vzstorage_mount_user = stack
Dec 03 00:28:35 compute-1 nova_compute[187157]:   wait_soft_reboot_seconds = 120
Dec 03 00:28:35 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: manila: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth_section = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth_type = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   cafile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   certfile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   collect-timing = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   connect-retries = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   connect-retry-delay = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   endpoint-override = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   insecure = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   keyfile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   max_version = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   min_version = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   region-name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   retriable-status-codes = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   service-name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   service-type = shared-file-system
Dec 03 00:28:36 compute-1 nova_compute[187157]:   share_apply_policy_timeout = 10
Dec 03 00:28:36 compute-1 nova_compute[187157]:   split-loggers = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   status-code-retries = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   status-code-retry-delay = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   timeout = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   valid-interfaces = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:     internal
Dec 03 00:28:36 compute-1 nova_compute[187157]:     public
Dec 03 00:28:36 compute-1 nova_compute[187157]:   version = None
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: metrics: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   required = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   weight_multiplier = 1.0
Dec 03 00:28:36 compute-1 nova_compute[187157]:   weight_of_unavailable = -10000.0
Dec 03 00:28:36 compute-1 nova_compute[187157]:   weight_setting = 
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: mks: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   enabled = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   mksproxy_base_url = http://127.0.0.1:6090/
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: neutron: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth-url = https://keystone-internal.openstack.svc:5000
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth_section = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth_type = password
Dec 03 00:28:36 compute-1 nova_compute[187157]:   cafile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   certfile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   collect-timing = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   connect-retries = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   connect-retry-delay = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   default-domain-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   default-domain-name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   default_floating_pool = nova
Dec 03 00:28:36 compute-1 nova_compute[187157]:   domain-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   domain-name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   endpoint-override = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   extension_sync_interval = 600
Dec 03 00:28:36 compute-1 nova_compute[187157]:   http_retries = 3
Dec 03 00:28:36 compute-1 nova_compute[187157]:   insecure = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   keyfile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   max_version = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   metadata_proxy_shared_secret = ***
Dec 03 00:28:36 compute-1 nova_compute[187157]:   min_version = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   ovs_bridge = br-int
Dec 03 00:28:36 compute-1 nova_compute[187157]:   password = ***
Dec 03 00:28:36 compute-1 nova_compute[187157]:   physnets = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   project-domain-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   project-domain-name = Default
Dec 03 00:28:36 compute-1 nova_compute[187157]:   project-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   project-name = service
Dec 03 00:28:36 compute-1 nova_compute[187157]:   region-name = regionOne
Dec 03 00:28:36 compute-1 nova_compute[187157]:   retriable-status-codes = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   service-name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   service-type = network
Dec 03 00:28:36 compute-1 nova_compute[187157]:   service_metadata_proxy = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   split-loggers = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   status-code-retries = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   status-code-retry-delay = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   system-scope = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   timeout = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   trust-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   user-domain-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   user-domain-name = Default
Dec 03 00:28:36 compute-1 nova_compute[187157]:   user-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   username = nova
Dec 03 00:28:36 compute-1 nova_compute[187157]:   valid-interfaces = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:     internal
Dec 03 00:28:36 compute-1 nova_compute[187157]:   version = None
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: neutron_tunnel: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   numa_nodes = 
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: notifications: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   bdms_in_notifications = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   default_level = INFO
Dec 03 00:28:36 compute-1 nova_compute[187157]:   include_share_mapping = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   notification_format = both
Dec 03 00:28:36 compute-1 nova_compute[187157]:   notify_on_state_change = vm_and_task_state
Dec 03 00:28:36 compute-1 nova_compute[187157]:   versioned_notifications_topics = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:     versioned_notifications
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: nova_sys_admin: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   capabilities = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:     0
Dec 03 00:28:36 compute-1 nova_compute[187157]:     1
Dec 03 00:28:36 compute-1 nova_compute[187157]:     12
Dec 03 00:28:36 compute-1 nova_compute[187157]:     2
Dec 03 00:28:36 compute-1 nova_compute[187157]:     21
Dec 03 00:28:36 compute-1 nova_compute[187157]:     3
Dec 03 00:28:36 compute-1 nova_compute[187157]:   group = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   helper_command = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   log_daemon_traceback = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   logger_name = oslo_privsep.daemon
Dec 03 00:28:36 compute-1 nova_compute[187157]:   thread_pool_size = 8
Dec 03 00:28:36 compute-1 nova_compute[187157]:   user = None
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: os_brick: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   lock_path = /var/lib/nova/tmp
Dec 03 00:28:36 compute-1 nova_compute[187157]:   wait_mpath_device_attempts = 4
Dec 03 00:28:36 compute-1 nova_compute[187157]:   wait_mpath_device_interval = 1
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: os_vif_linux_bridge: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   flat_interface = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   forward_bridge_interface = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:     all
Dec 03 00:28:36 compute-1 nova_compute[187157]:   iptables_bottom_regex = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   iptables_drop_action = DROP
Dec 03 00:28:36 compute-1 nova_compute[187157]:   iptables_top_regex = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   network_device_mtu = 1500
Dec 03 00:28:36 compute-1 nova_compute[187157]:   use_ipv6 = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   vlan_interface = None
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: os_vif_ovs: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   default_qos_type = linux-noop
Dec 03 00:28:36 compute-1 nova_compute[187157]:   isolate_vif = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   network_device_mtu = 1500
Dec 03 00:28:36 compute-1 nova_compute[187157]:   ovs_vsctl_timeout = 120
Dec 03 00:28:36 compute-1 nova_compute[187157]:   ovsdb_connection = tcp:127.0.0.1:6640
Dec 03 00:28:36 compute-1 nova_compute[187157]:   ovsdb_interface = native
Dec 03 00:28:36 compute-1 nova_compute[187157]:   per_port_bridge = False
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: oslo_concurrency: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   disable_process_locking = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   lock_path = /var/lib/nova/tmp
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: oslo_limit: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth-url = https://keystone-internal.openstack.svc:5000
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth_section = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth_type = password
Dec 03 00:28:36 compute-1 nova_compute[187157]:   cafile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   certfile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   collect-timing = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   connect-retries = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   connect-retry-delay = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   default-domain-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   default-domain-name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   domain-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   domain-name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   endpoint-override = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   endpoint_id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   endpoint_interface = internal
Dec 03 00:28:36 compute-1 nova_compute[187157]:   endpoint_region_name = regionOne
Dec 03 00:28:36 compute-1 nova_compute[187157]:   endpoint_service_name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   endpoint_service_type = compute
Dec 03 00:28:36 compute-1 nova_compute[187157]:   insecure = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   keyfile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   max-version = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   min-version = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   password = ***
Dec 03 00:28:36 compute-1 nova_compute[187157]:   project-domain-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   project-domain-name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   project-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   project-name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   region-name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   retriable-status-codes = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   service-name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   service-type = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   split-loggers = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   status-code-retries = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   status-code-retry-delay = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   system-scope = all
Dec 03 00:28:36 compute-1 nova_compute[187157]:   timeout = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   trust-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   user-domain-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   user-domain-name = Default
Dec 03 00:28:36 compute-1 nova_compute[187157]:   user-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   username = nova
Dec 03 00:28:36 compute-1 nova_compute[187157]:   valid-interfaces = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   version = None
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: oslo_messaging_metrics: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   metrics_buffer_size = 1000
Dec 03 00:28:36 compute-1 nova_compute[187157]:   metrics_enabled = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   metrics_process_name = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   metrics_socket_file = /var/tmp/metrics_collector.sock
Dec 03 00:28:36 compute-1 nova_compute[187157]:   metrics_thread_stop_timeout = 10
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: oslo_messaging_notifications: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   driver = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:     messagingv2
Dec 03 00:28:36 compute-1 nova_compute[187157]:   retry = -1
Dec 03 00:28:36 compute-1 nova_compute[187157]:   topics = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:     notifications
Dec 03 00:28:36 compute-1 nova_compute[187157]:   transport_url = ***
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: oslo_messaging_rabbit: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   amqp_auto_delete = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   amqp_durable_queues = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   conn_pool_min_size = 2
Dec 03 00:28:36 compute-1 nova_compute[187157]:   conn_pool_ttl = 1200
Dec 03 00:28:36 compute-1 nova_compute[187157]:   direct_mandatory_flag = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   enable_cancel_on_failover = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   heartbeat_in_pthread = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   heartbeat_rate = 3
Dec 03 00:28:36 compute-1 nova_compute[187157]:   heartbeat_timeout_threshold = 60
Dec 03 00:28:36 compute-1 nova_compute[187157]:   hostname = compute-1
Dec 03 00:28:36 compute-1 nova_compute[187157]:   kombu_compression = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   kombu_failover_strategy = round-robin
Dec 03 00:28:36 compute-1 nova_compute[187157]:   kombu_missing_consumer_retry_timeout = 60
Dec 03 00:28:36 compute-1 nova_compute[187157]:   kombu_reconnect_delay = 1.0
Dec 03 00:28:36 compute-1 nova_compute[187157]:   kombu_reconnect_splay = 0.0
Dec 03 00:28:36 compute-1 nova_compute[187157]:   processname = nova-compute
Dec 03 00:28:36 compute-1 nova_compute[187157]:   rabbit_ha_queues = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   rabbit_interval_max = 30
Dec 03 00:28:36 compute-1 nova_compute[187157]:   rabbit_login_method = AMQPLAIN
Dec 03 00:28:36 compute-1 nova_compute[187157]:   rabbit_qos_prefetch_count = 0
Dec 03 00:28:36 compute-1 nova_compute[187157]:   rabbit_quorum_delivery_limit = 0
Dec 03 00:28:36 compute-1 nova_compute[187157]:   rabbit_quorum_max_memory_bytes = 0
Dec 03 00:28:36 compute-1 nova_compute[187157]:   rabbit_quorum_max_memory_length = 0
Dec 03 00:28:36 compute-1 nova_compute[187157]:   rabbit_quorum_queue = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   rabbit_retry_backoff = 2
Dec 03 00:28:36 compute-1 nova_compute[187157]:   rabbit_retry_interval = 1
Dec 03 00:28:36 compute-1 nova_compute[187157]:   rabbit_stream_fanout = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   rabbit_transient_queues_ttl = 1800
Dec 03 00:28:36 compute-1 nova_compute[187157]:   rabbit_transient_quorum_queue = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   rpc_conn_pool_size = 30
Dec 03 00:28:36 compute-1 nova_compute[187157]:   ssl = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   ssl_ca_file = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   ssl_cert_file = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   ssl_enforce_fips_mode = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   ssl_key_file = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   ssl_version = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   use_queue_manager = False
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: oslo_middleware: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   http_basic_auth_user_file = /etc/htpasswd
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: oslo_policy: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   enforce_new_defaults = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   enforce_scope = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   policy_default_rule = default
Dec 03 00:28:36 compute-1 nova_compute[187157]:   policy_dirs = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:     policy.d
Dec 03 00:28:36 compute-1 nova_compute[187157]:   policy_file = policy.yaml
Dec 03 00:28:36 compute-1 nova_compute[187157]:   remote_content_type = application/x-www-form-urlencoded
Dec 03 00:28:36 compute-1 nova_compute[187157]:   remote_ssl_ca_crt_file = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   remote_ssl_client_crt_file = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   remote_ssl_client_key_file = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   remote_ssl_verify_server_crt = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   remote_timeout = 60.0
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: oslo_reports: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   file_event_handler = /var/lib/nova
Dec 03 00:28:36 compute-1 nova_compute[187157]:   file_event_handler_interval = 1
Dec 03 00:28:36 compute-1 nova_compute[187157]:   log_dir = None
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: oslo_versionedobjects: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   fatal_exception_format_errors = False
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: pci: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   alias = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   device_spec = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   report_in_placement = False
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: placement: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth-url = https://keystone-internal.openstack.svc:5000
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth_section = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth_type = password
Dec 03 00:28:36 compute-1 nova_compute[187157]:   cafile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   certfile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   collect-timing = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   connect-retries = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   connect-retry-delay = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   default-domain-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   default-domain-name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   domain-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   domain-name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   endpoint-override = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   insecure = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   keyfile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   max_version = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   min_version = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   password = ***
Dec 03 00:28:36 compute-1 nova_compute[187157]:   project-domain-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   project-domain-name = Default
Dec 03 00:28:36 compute-1 nova_compute[187157]:   project-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   project-name = service
Dec 03 00:28:36 compute-1 nova_compute[187157]:   region-name = regionOne
Dec 03 00:28:36 compute-1 nova_compute[187157]:   retriable-status-codes = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   service-name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   service-type = placement
Dec 03 00:28:36 compute-1 nova_compute[187157]:   split-loggers = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   status-code-retries = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   status-code-retry-delay = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   system-scope = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   timeout = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   trust-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   user-domain-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   user-domain-name = Default
Dec 03 00:28:36 compute-1 nova_compute[187157]:   user-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   username = nova
Dec 03 00:28:36 compute-1 nova_compute[187157]:   valid-interfaces = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:     internal
Dec 03 00:28:36 compute-1 nova_compute[187157]:   version = None
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: privsep_osbrick: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   capabilities = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:     2
Dec 03 00:28:36 compute-1 nova_compute[187157]:     21
Dec 03 00:28:36 compute-1 nova_compute[187157]:   group = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   helper_command = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   log_daemon_traceback = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   logger_name = os_brick.privileged
Dec 03 00:28:36 compute-1 nova_compute[187157]:   thread_pool_size = 8
Dec 03 00:28:36 compute-1 nova_compute[187157]:   user = None
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: quota: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   cores = 20
Dec 03 00:28:36 compute-1 nova_compute[187157]:   count_usage_from_placement = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   driver = nova.quota.DbQuotaDriver
Dec 03 00:28:36 compute-1 nova_compute[187157]:   injected_file_content_bytes = 10240
Dec 03 00:28:36 compute-1 nova_compute[187157]:   injected_file_path_length = 255
Dec 03 00:28:36 compute-1 nova_compute[187157]:   injected_files = 5
Dec 03 00:28:36 compute-1 nova_compute[187157]:   instances = 10
Dec 03 00:28:36 compute-1 nova_compute[187157]:   key_pairs = 100
Dec 03 00:28:36 compute-1 nova_compute[187157]:   metadata_items = 128
Dec 03 00:28:36 compute-1 nova_compute[187157]:   ram = 51200
Dec 03 00:28:36 compute-1 nova_compute[187157]:   recheck_quota = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   server_group_members = 10
Dec 03 00:28:36 compute-1 nova_compute[187157]:   server_groups = 10
Dec 03 00:28:36 compute-1 nova_compute[187157]:   unified_limits_resource_list = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:     servers
Dec 03 00:28:36 compute-1 nova_compute[187157]:   unified_limits_resource_strategy = require
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: scheduler: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   discover_hosts_in_cells_interval = -1
Dec 03 00:28:36 compute-1 nova_compute[187157]:   enable_isolated_aggregate_filtering = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   image_metadata_prefilter = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   limit_tenants_to_placement_aggregate = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   max_attempts = 3
Dec 03 00:28:36 compute-1 nova_compute[187157]:   max_placement_results = 1000
Dec 03 00:28:36 compute-1 nova_compute[187157]:   placement_aggregate_required_for_tenants = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   query_placement_for_image_type_support = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   query_placement_for_routed_network_aggregates = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   workers = None
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: serial_console: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   base_url = ws://127.0.0.1:6083/
Dec 03 00:28:36 compute-1 nova_compute[187157]:   enabled = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   port_range = 10000:20000
Dec 03 00:28:36 compute-1 nova_compute[187157]:   proxyclient_address = 127.0.0.1
Dec 03 00:28:36 compute-1 nova_compute[187157]:   serialproxy_host = 0.0.0.0
Dec 03 00:28:36 compute-1 nova_compute[187157]:   serialproxy_port = 6083
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: service_user: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth-url = https://keystone-internal.openstack.svc:5000
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth_section = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth_type = password
Dec 03 00:28:36 compute-1 nova_compute[187157]:   cafile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   certfile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   collect-timing = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   default-domain-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   default-domain-name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   domain-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   domain-name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   insecure = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   keyfile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   password = ***
Dec 03 00:28:36 compute-1 nova_compute[187157]:   project-domain-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   project-domain-name = Default
Dec 03 00:28:36 compute-1 nova_compute[187157]:   project-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   project-name = service
Dec 03 00:28:36 compute-1 nova_compute[187157]:   send_service_user_token = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   split-loggers = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   system-scope = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   timeout = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   trust-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   user-domain-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   user-domain-name = Default
Dec 03 00:28:36 compute-1 nova_compute[187157]:   user-id = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   username = nova
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: spice: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   agent_enabled = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   enabled = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html
Dec 03 00:28:36 compute-1 nova_compute[187157]:   html5proxy_host = 0.0.0.0
Dec 03 00:28:36 compute-1 nova_compute[187157]:   html5proxy_port = 6082
Dec 03 00:28:36 compute-1 nova_compute[187157]:   image_compression = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   jpeg_compression = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   playback_compression = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   require_secure = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   server_listen = 127.0.0.1
Dec 03 00:28:36 compute-1 nova_compute[187157]:   server_proxyclient_address = 127.0.0.1
Dec 03 00:28:36 compute-1 nova_compute[187157]:   spice_direct_proxy_base_url = http://127.0.0.1:13002/nova
Dec 03 00:28:36 compute-1 nova_compute[187157]:   streaming_mode = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   zlib_compression = None
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: upgrade_levels: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   baseapi = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   compute = auto
Dec 03 00:28:36 compute-1 nova_compute[187157]:   conductor = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   scheduler = None
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: vault: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   approle_role_id = ***
Dec 03 00:28:36 compute-1 nova_compute[187157]:   approle_secret_id = ***
Dec 03 00:28:36 compute-1 nova_compute[187157]:   kv_mountpoint = secret
Dec 03 00:28:36 compute-1 nova_compute[187157]:   kv_path = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   kv_version = 2
Dec 03 00:28:36 compute-1 nova_compute[187157]:   namespace = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   root_token_id = ***
Dec 03 00:28:36 compute-1 nova_compute[187157]:   ssl_ca_crt_file = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   timeout = 60.0
Dec 03 00:28:36 compute-1 nova_compute[187157]:   use_ssl = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   vault_url = http://127.0.0.1:8200
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: vendordata_dynamic_auth: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth_section = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth_type = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   cafile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   certfile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   collect-timing = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   insecure = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   keyfile = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   split-loggers = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   timeout = None
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: vif_plug_linux_bridge_privileged: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   capabilities = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:     12
Dec 03 00:28:36 compute-1 nova_compute[187157]:   group = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   helper_command = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   log_daemon_traceback = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   logger_name = oslo_privsep.daemon
Dec 03 00:28:36 compute-1 nova_compute[187157]:   thread_pool_size = 8
Dec 03 00:28:36 compute-1 nova_compute[187157]:   user = None
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: vif_plug_ovs_privileged: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   capabilities = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:     1
Dec 03 00:28:36 compute-1 nova_compute[187157]:     12
Dec 03 00:28:36 compute-1 nova_compute[187157]:   group = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   helper_command = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   log_daemon_traceback = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   logger_name = oslo_privsep.daemon
Dec 03 00:28:36 compute-1 nova_compute[187157]:   thread_pool_size = 8
Dec 03 00:28:36 compute-1 nova_compute[187157]:   user = None
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: vmware: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   api_retry_count = 10
Dec 03 00:28:36 compute-1 nova_compute[187157]:   ca_file = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   cache_prefix = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   cluster_name = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   connection_pool_size = 10
Dec 03 00:28:36 compute-1 nova_compute[187157]:   console_delay_seconds = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   datastore_regex = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   host_ip = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   host_password = ***
Dec 03 00:28:36 compute-1 nova_compute[187157]:   host_port = 443
Dec 03 00:28:36 compute-1 nova_compute[187157]:   host_username = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   insecure = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   integration_bridge = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   maximum_objects = 100
Dec 03 00:28:36 compute-1 nova_compute[187157]:   pbm_default_policy = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   pbm_enabled = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   pbm_wsdl_location = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   serial_log_dir = /opt/vmware/vspc
Dec 03 00:28:36 compute-1 nova_compute[187157]:   serial_port_proxy_uri = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   serial_port_service_uri = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   task_poll_interval = 0.5
Dec 03 00:28:36 compute-1 nova_compute[187157]:   use_linked_clone = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   vnc_keymap = en-us
Dec 03 00:28:36 compute-1 nova_compute[187157]:   vnc_port = 5900
Dec 03 00:28:36 compute-1 nova_compute[187157]:   vnc_port_total = 10000
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: vnc: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   auth_schemes = 
Dec 03 00:28:36 compute-1 nova_compute[187157]:     none
Dec 03 00:28:36 compute-1 nova_compute[187157]:   enabled = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   novncproxy_base_url = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html
Dec 03 00:28:36 compute-1 nova_compute[187157]:   novncproxy_host = 0.0.0.0
Dec 03 00:28:36 compute-1 nova_compute[187157]:   novncproxy_port = 6080
Dec 03 00:28:36 compute-1 nova_compute[187157]:   server_listen = ::0
Dec 03 00:28:36 compute-1 nova_compute[187157]:   server_proxyclient_address = 192.168.122.101
Dec 03 00:28:36 compute-1 nova_compute[187157]:   vencrypt_ca_certs = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   vencrypt_client_cert = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   vencrypt_client_key = None
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: workarounds: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   disable_compute_service_check_for_ffu = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   disable_deep_image_inspection = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   disable_fallback_pcpu_query = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   disable_group_policy_check_upcall = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   disable_libvirt_livesnapshot = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   disable_rootwrap = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   enable_numa_live_migration = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   enable_qemu_monitor_announce_self = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   ensure_libvirt_rbd_instance_dir_cleanup = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   handle_virt_lifecycle_events = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   libvirt_disable_apic = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   never_download_image_if_on_rbd = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   qemu_monitor_announce_self_count = 3
Dec 03 00:28:36 compute-1 nova_compute[187157]:   qemu_monitor_announce_self_interval = 1
Dec 03 00:28:36 compute-1 nova_compute[187157]:   reserve_disk_resource_for_image_cache = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   skip_cpu_compare_at_startup = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   skip_cpu_compare_on_dest = True
Dec 03 00:28:36 compute-1 nova_compute[187157]:   skip_hypervisor_version_check_on_lm = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   skip_reserve_in_use_ironic_nodes = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   unified_limits_count_pcpu_as_vcpu = False
Dec 03 00:28:36 compute-1 nova_compute[187157]:   wait_for_vif_plugged_event_during_hard_reboot = 
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: wsgi: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   api_paste_config = api-paste.ini
Dec 03 00:28:36 compute-1 nova_compute[187157]:   secure_proxy_ssl_header = None
Dec 03 00:28:36 compute-1 nova_compute[187157]: 
Dec 03 00:28:36 compute-1 nova_compute[187157]: zvm: 
Dec 03 00:28:36 compute-1 nova_compute[187157]:   ca_file = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   cloud_connector_url = None
Dec 03 00:28:36 compute-1 nova_compute[187157]:   image_tmp_path = /var/lib/nova/images
Dec 03 00:28:36 compute-1 nova_compute[187157]:   reachable_timeout = 300
Dec 03 00:28:37 compute-1 nova_compute[187157]: 2025-12-03 00:28:37.008 187161 DEBUG nova.virt.libvirt.driver [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:28:37 compute-1 nova_compute[187157]: 2025-12-03 00:28:37.009 187161 DEBUG nova.virt.libvirt.driver [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:28:37 compute-1 nova_compute[187157]: 2025-12-03 00:28:37.009 187161 DEBUG nova.virt.libvirt.driver [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:28:37 compute-1 nova_compute[187157]: 2025-12-03 00:28:37.009 187161 DEBUG nova.virt.libvirt.driver [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] No VIF found with MAC fa:16:3e:9b:03:35, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:28:37 compute-1 nova_compute[187157]: 2025-12-03 00:28:37.887 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:38 compute-1 nova_compute[187157]: 2025-12-03 00:28:38.679 187161 DEBUG oslo_concurrency.lockutils [None req-d6bc4092-5852-4c23-9727-5ba8d8c4d055 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 7.792s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:40 compute-1 podman[222704]: 2025-12-03 00:28:40.260308653 +0000 UTC m=+0.086426880 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350)
Dec 03 00:28:40 compute-1 nova_compute[187157]: 2025-12-03 00:28:40.871 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:42 compute-1 nova_compute[187157]: 2025-12-03 00:28:42.890 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:43 compute-1 podman[222727]: 2025-12-03 00:28:43.221260837 +0000 UTC m=+0.065483264 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202)
Dec 03 00:28:44 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 03 00:28:45 compute-1 nova_compute[187157]: 2025-12-03 00:28:45.872 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:47 compute-1 nova_compute[187157]: 2025-12-03 00:28:47.900 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:49 compute-1 openstack_network_exporter[199685]: ERROR   00:28:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:28:49 compute-1 openstack_network_exporter[199685]: ERROR   00:28:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:28:49 compute-1 openstack_network_exporter[199685]: ERROR   00:28:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:28:49 compute-1 openstack_network_exporter[199685]: ERROR   00:28:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:28:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:28:49 compute-1 openstack_network_exporter[199685]: ERROR   00:28:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:28:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:28:50 compute-1 ovn_controller[95464]: 2025-12-03T00:28:50Z|00313|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 03 00:28:50 compute-1 nova_compute[187157]: 2025-12-03 00:28:50.874 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:52 compute-1 podman[222749]: 2025-12-03 00:28:52.238480003 +0000 UTC m=+0.064143312 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:28:52 compute-1 nova_compute[187157]: 2025-12-03 00:28:52.903 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:55 compute-1 nova_compute[187157]: 2025-12-03 00:28:55.875 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:56 compute-1 podman[222773]: 2025-12-03 00:28:56.249736777 +0000 UTC m=+0.070958316 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4)
Dec 03 00:28:56 compute-1 podman[222774]: 2025-12-03 00:28:56.310940925 +0000 UTC m=+0.119605381 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 03 00:28:57 compute-1 nova_compute[187157]: 2025-12-03 00:28:57.211 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:57 compute-1 nova_compute[187157]: 2025-12-03 00:28:57.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:57 compute-1 nova_compute[187157]: 2025-12-03 00:28:57.906 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:58 compute-1 nova_compute[187157]: 2025-12-03 00:28:58.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:59 compute-1 nova_compute[187157]: 2025-12-03 00:28:59.945 187161 DEBUG oslo_concurrency.lockutils [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Acquiring lock "1c32f4c5-c959-44c9-be90-2e0a08b52619" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:59 compute-1 nova_compute[187157]: 2025-12-03 00:28:59.946 187161 DEBUG oslo_concurrency.lockutils [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:29:00 compute-1 sshd-session[222819]: Invalid user solana from 45.148.10.240 port 52894
Dec 03 00:29:00 compute-1 sshd-session[222819]: Connection closed by invalid user solana 45.148.10.240 port 52894 [preauth]
Dec 03 00:29:00 compute-1 nova_compute[187157]: 2025-12-03 00:29:00.495 187161 DEBUG nova.objects.instance [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lazy-loading 'flavor' on Instance uuid 1c32f4c5-c959-44c9-be90-2e0a08b52619 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:29:00 compute-1 nova_compute[187157]: 2025-12-03 00:29:00.906 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:01.758 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:29:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:01.758 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:29:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:01.759 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:29:01 compute-1 nova_compute[187157]: 2025-12-03 00:29:01.822 187161 INFO nova.compute.manager [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Detaching volume fec9baad-7985-4f1a-a1d4-0469965aa4e9
Dec 03 00:29:01 compute-1 nova_compute[187157]: 2025-12-03 00:29:01.926 187161 INFO nova.virt.block_device [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Attempting to driver detach volume fec9baad-7985-4f1a-a1d4-0469965aa4e9 from mountpoint /dev/vdb
Dec 03 00:29:01 compute-1 nova_compute[187157]: 2025-12-03 00:29:01.934 187161 DEBUG nova.virt.libvirt.driver [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Found disk vdb by alias ua-fec9baad-7985-4f1a-a1d4-0469965aa4e9 _get_guest_disk_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2825
Dec 03 00:29:01 compute-1 nova_compute[187157]: 2025-12-03 00:29:01.936 187161 DEBUG nova.virt.libvirt.driver [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Found disk vdb by alias ua-fec9baad-7985-4f1a-a1d4-0469965aa4e9 _get_guest_disk_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2825
Dec 03 00:29:01 compute-1 nova_compute[187157]: 2025-12-03 00:29:01.937 187161 DEBUG nova.virt.libvirt.driver [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Attempting to detach device vdb from instance 1c32f4c5-c959-44c9-be90-2e0a08b52619 from the persistent domain config. _detach_from_persistent /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2576
Dec 03 00:29:01 compute-1 nova_compute[187157]: 2025-12-03 00:29:01.937 187161 DEBUG nova.virt.libvirt.guest [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] detach device xml: <disk type="file" device="disk">
Dec 03 00:29:01 compute-1 nova_compute[187157]:   <driver name="qemu" type="raw" cache="none" io="native"/>
Dec 03 00:29:01 compute-1 nova_compute[187157]:   <alias name="ua-fec9baad-7985-4f1a-a1d4-0469965aa4e9"/>
Dec 03 00:29:01 compute-1 nova_compute[187157]:   <source file="/var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540/volume-fec9baad-7985-4f1a-a1d4-0469965aa4e9"/>
Dec 03 00:29:01 compute-1 nova_compute[187157]:   <target dev="vdb" bus="virtio"/>
Dec 03 00:29:01 compute-1 nova_compute[187157]:   <serial>fec9baad-7985-4f1a-a1d4-0469965aa4e9</serial>
Dec 03 00:29:01 compute-1 nova_compute[187157]:   <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Dec 03 00:29:01 compute-1 nova_compute[187157]: </disk>
Dec 03 00:29:01 compute-1 nova_compute[187157]:  detach_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:466
Dec 03 00:29:02 compute-1 nova_compute[187157]: 2025-12-03 00:29:02.010 187161 DEBUG nova.virt.libvirt.driver [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Found disk vdb by alias ua-fec9baad-7985-4f1a-a1d4-0469965aa4e9 _get_guest_disk_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2825
Dec 03 00:29:02 compute-1 nova_compute[187157]: 2025-12-03 00:29:02.010 187161 WARNING nova.virt.libvirt.driver [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Failed to detach device vdb from instance 1c32f4c5-c959-44c9-be90-2e0a08b52619 from the persistent domain config. Libvirt did not report any error but the device is still in the config.
Dec 03 00:29:02 compute-1 nova_compute[187157]: 2025-12-03 00:29:02.011 187161 DEBUG nova.virt.libvirt.driver [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] (1/8): Attempting to detach device vdb with device alias ua-fec9baad-7985-4f1a-a1d4-0469965aa4e9 from instance 1c32f4c5-c959-44c9-be90-2e0a08b52619 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2612
Dec 03 00:29:02 compute-1 nova_compute[187157]: 2025-12-03 00:29:02.012 187161 DEBUG nova.virt.libvirt.guest [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] detach device xml: <disk type="file" device="disk">
Dec 03 00:29:02 compute-1 nova_compute[187157]:   <driver name="qemu" type="raw" cache="none" io="native"/>
Dec 03 00:29:02 compute-1 nova_compute[187157]:   <alias name="ua-fec9baad-7985-4f1a-a1d4-0469965aa4e9"/>
Dec 03 00:29:02 compute-1 nova_compute[187157]:   <source file="/var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540/volume-fec9baad-7985-4f1a-a1d4-0469965aa4e9"/>
Dec 03 00:29:02 compute-1 nova_compute[187157]:   <target dev="vdb" bus="virtio"/>
Dec 03 00:29:02 compute-1 nova_compute[187157]:   <serial>fec9baad-7985-4f1a-a1d4-0469965aa4e9</serial>
Dec 03 00:29:02 compute-1 nova_compute[187157]:   <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Dec 03 00:29:02 compute-1 nova_compute[187157]: </disk>
Dec 03 00:29:02 compute-1 nova_compute[187157]:  detach_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:466
Dec 03 00:29:02 compute-1 nova_compute[187157]: 2025-12-03 00:29:02.085 187161 DEBUG nova.virt.libvirt.driver [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Start waiting for the detach event from libvirt for device vdb with device alias ua-fec9baad-7985-4f1a-a1d4-0469965aa4e9 for instance 1c32f4c5-c959-44c9-be90-2e0a08b52619 _detach_from_live_and_wait_for_event /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2688
Dec 03 00:29:02 compute-1 nova_compute[187157]: 2025-12-03 00:29:02.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:29:02 compute-1 nova_compute[187157]: 2025-12-03 00:29:02.910 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:03 compute-1 nova_compute[187157]: 2025-12-03 00:29:03.213 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:29:03 compute-1 nova_compute[187157]: 2025-12-03 00:29:03.213 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:29:03 compute-1 nova_compute[187157]: 2025-12-03 00:29:03.214 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:29:03 compute-1 nova_compute[187157]: 2025-12-03 00:29:03.214 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:29:04 compute-1 nova_compute[187157]: 2025-12-03 00:29:04.404 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:29:04 compute-1 nova_compute[187157]: 2025-12-03 00:29:04.455 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:29:04 compute-1 nova_compute[187157]: 2025-12-03 00:29:04.456 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:29:04 compute-1 nova_compute[187157]: 2025-12-03 00:29:04.507 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:29:04 compute-1 nova_compute[187157]: 2025-12-03 00:29:04.640 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:29:04 compute-1 nova_compute[187157]: 2025-12-03 00:29:04.641 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:29:04 compute-1 nova_compute[187157]: 2025-12-03 00:29:04.657 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:29:04 compute-1 nova_compute[187157]: 2025-12-03 00:29:04.658 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5574MB free_disk=73.13209915161133GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:29:04 compute-1 nova_compute[187157]: 2025-12-03 00:29:04.658 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:29:04 compute-1 nova_compute[187157]: 2025-12-03 00:29:04.658 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:29:05 compute-1 podman[197537]: time="2025-12-03T00:29:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:29:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:29:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:29:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:29:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3080 "" "Go-http-client/1.1"
Dec 03 00:29:05 compute-1 nova_compute[187157]: 2025-12-03 00:29:05.705 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Instance 1c32f4c5-c959-44c9-be90-2e0a08b52619 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:29:05 compute-1 nova_compute[187157]: 2025-12-03 00:29:05.706 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:29:05 compute-1 nova_compute[187157]: 2025-12-03 00:29:05.706 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:29:04 up  1:36,  0 user,  load average: 0.30, 0.27, 0.27\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_079699d388d64224949dbfaf77fa93bd': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:29:05 compute-1 nova_compute[187157]: 2025-12-03 00:29:05.846 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing inventories for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 03 00:29:05 compute-1 nova_compute[187157]: 2025-12-03 00:29:05.944 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:05 compute-1 nova_compute[187157]: 2025-12-03 00:29:05.956 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Updating ProviderTree inventory for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 03 00:29:05 compute-1 nova_compute[187157]: 2025-12-03 00:29:05.957 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Updating inventory in ProviderTree for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 03 00:29:05 compute-1 nova_compute[187157]: 2025-12-03 00:29:05.974 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing aggregate associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 03 00:29:05 compute-1 nova_compute[187157]: 2025-12-03 00:29:05.990 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing trait associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ARCH_X86_64,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 03 00:29:06 compute-1 nova_compute[187157]: 2025-12-03 00:29:06.021 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:29:07 compute-1 nova_compute[187157]: 2025-12-03 00:29:07.255 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:29:07 compute-1 nova_compute[187157]: 2025-12-03 00:29:07.774 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:29:07 compute-1 nova_compute[187157]: 2025-12-03 00:29:07.775 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.117s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:29:07 compute-1 nova_compute[187157]: 2025-12-03 00:29:07.913 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:09 compute-1 nova_compute[187157]: 2025-12-03 00:29:09.770 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:29:09 compute-1 nova_compute[187157]: 2025-12-03 00:29:09.771 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:29:09 compute-1 nova_compute[187157]: 2025-12-03 00:29:09.771 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:29:10 compute-1 nova_compute[187157]: 2025-12-03 00:29:10.943 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:11 compute-1 podman[222833]: 2025-12-03 00:29:11.263953625 +0000 UTC m=+0.092800994 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Dec 03 00:29:11 compute-1 nova_compute[187157]: 2025-12-03 00:29:11.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:29:12 compute-1 nova_compute[187157]: 2025-12-03 00:29:12.915 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:14 compute-1 podman[222854]: 2025-12-03 00:29:14.226503657 +0000 UTC m=+0.062392469 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:29:15 compute-1 nova_compute[187157]: 2025-12-03 00:29:15.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:29:15 compute-1 nova_compute[187157]: 2025-12-03 00:29:15.947 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:17 compute-1 nova_compute[187157]: 2025-12-03 00:29:17.917 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:19 compute-1 openstack_network_exporter[199685]: ERROR   00:29:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:29:19 compute-1 openstack_network_exporter[199685]: ERROR   00:29:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:29:19 compute-1 openstack_network_exporter[199685]: ERROR   00:29:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:29:19 compute-1 openstack_network_exporter[199685]: ERROR   00:29:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:29:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:29:19 compute-1 openstack_network_exporter[199685]: ERROR   00:29:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:29:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:29:19 compute-1 nova_compute[187157]: 2025-12-03 00:29:19.695 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:29:20 compute-1 nova_compute[187157]: 2025-12-03 00:29:20.950 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:22 compute-1 nova_compute[187157]: 2025-12-03 00:29:22.087 187161 WARNING nova.virt.libvirt.driver [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Waiting for libvirt event about the detach of device vdb with device alias ua-fec9baad-7985-4f1a-a1d4-0469965aa4e9 from instance 1c32f4c5-c959-44c9-be90-2e0a08b52619 is timed out.
Dec 03 00:29:22 compute-1 nova_compute[187157]: 2025-12-03 00:29:22.092 187161 INFO nova.virt.libvirt.driver [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Successfully detached device vdb from instance 1c32f4c5-c959-44c9-be90-2e0a08b52619 from the live domain config.
Dec 03 00:29:22 compute-1 nova_compute[187157]: 2025-12-03 00:29:22.094 187161 DEBUG nova.virt.libvirt.volume.mount [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Got _HostMountState generation 0 get_state /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:91
Dec 03 00:29:22 compute-1 nova_compute[187157]: 2025-12-03 00:29:22.094 187161 DEBUG nova.virt.libvirt.volume.mount [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] _HostMountState.umount(vol_name=volume-fec9baad-7985-4f1a-a1d4-0469965aa4e9, mountpoint=/var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540) generation 0 umount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:349
Dec 03 00:29:22 compute-1 nova_compute[187157]: 2025-12-03 00:29:22.099 187161 DEBUG nova.virt.libvirt.volume.mount [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Unmounting /var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540 generation 0 _real_umount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:382
Dec 03 00:29:22 compute-1 systemd[1]: var-lib-nova-mnt-cec891824fc057f7ee63f2ed70041540.mount: Deactivated successfully.
Dec 03 00:29:22 compute-1 nova_compute[187157]: 2025-12-03 00:29:22.142 187161 DEBUG nova.virt.libvirt.volume.mount [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] _HostMountState.umount() for /var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540 generation 0 completed successfully umount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:372
Dec 03 00:29:22 compute-1 nova_compute[187157]: 2025-12-03 00:29:22.921 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:23 compute-1 podman[222887]: 2025-12-03 00:29:23.231579951 +0000 UTC m=+0.065737328 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:29:23 compute-1 nova_compute[187157]: 2025-12-03 00:29:23.516 187161 DEBUG oslo_concurrency.lockutils [None req-6ed70256-f91e-4fa7-92cb-7d37478955bb bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 23.570s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:29:25 compute-1 nova_compute[187157]: 2025-12-03 00:29:25.952 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:25 compute-1 nova_compute[187157]: 2025-12-03 00:29:25.963 187161 DEBUG oslo_concurrency.lockutils [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Acquiring lock "1c32f4c5-c959-44c9-be90-2e0a08b52619" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:29:25 compute-1 nova_compute[187157]: 2025-12-03 00:29:25.964 187161 DEBUG oslo_concurrency.lockutils [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:29:25 compute-1 nova_compute[187157]: 2025-12-03 00:29:25.964 187161 DEBUG oslo_concurrency.lockutils [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Acquiring lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:29:25 compute-1 nova_compute[187157]: 2025-12-03 00:29:25.965 187161 DEBUG oslo_concurrency.lockutils [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:29:25 compute-1 nova_compute[187157]: 2025-12-03 00:29:25.965 187161 DEBUG oslo_concurrency.lockutils [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:29:25 compute-1 nova_compute[187157]: 2025-12-03 00:29:25.981 187161 INFO nova.compute.manager [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Terminating instance
Dec 03 00:29:26 compute-1 nova_compute[187157]: 2025-12-03 00:29:26.495 187161 DEBUG nova.compute.manager [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:29:26 compute-1 kernel: tap21b5b048-03 (unregistering): left promiscuous mode
Dec 03 00:29:26 compute-1 NetworkManager[55553]: <info>  [1764721766.5182] device (tap21b5b048-03): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:29:26 compute-1 ovn_controller[95464]: 2025-12-03T00:29:26Z|00314|binding|INFO|Releasing lport 21b5b048-03fd-4cce-b80e-426f2c35c56a from this chassis (sb_readonly=0)
Dec 03 00:29:26 compute-1 ovn_controller[95464]: 2025-12-03T00:29:26Z|00315|binding|INFO|Setting lport 21b5b048-03fd-4cce-b80e-426f2c35c56a down in Southbound
Dec 03 00:29:26 compute-1 ovn_controller[95464]: 2025-12-03T00:29:26Z|00316|binding|INFO|Removing iface tap21b5b048-03 ovn-installed in OVS
Dec 03 00:29:26 compute-1 nova_compute[187157]: 2025-12-03 00:29:26.526 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:26 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:26.539 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:03:35 10.100.0.8'], port_security=['fa:16:3e:9b:03:35 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1c32f4c5-c959-44c9-be90-2e0a08b52619', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-47c9dea6-51f8-4918-b7de-0893eb139352', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '079699d388d64224949dbfaf77fa93bd', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'de55cd34-9754-4d67-ad85-e10a26bc577b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41d502de-899a-45f5-a018-49c03d644872, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>], logical_port=21b5b048-03fd-4cce-b80e-426f2c35c56a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f0a1b6ee6f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:29:26 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:26.540 104348 INFO neutron.agent.ovn.metadata.agent [-] Port 21b5b048-03fd-4cce-b80e-426f2c35c56a in datapath 47c9dea6-51f8-4918-b7de-0893eb139352 unbound from our chassis
Dec 03 00:29:26 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:26.541 104348 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 47c9dea6-51f8-4918-b7de-0893eb139352, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:29:26 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:26.542 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[38321ade-dcc7-4227-a3c8-7356e3323286]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:29:26 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:26.542 104348 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352 namespace which is not needed anymore
Dec 03 00:29:26 compute-1 nova_compute[187157]: 2025-12-03 00:29:26.542 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:26 compute-1 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000022.scope: Deactivated successfully.
Dec 03 00:29:26 compute-1 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000022.scope: Consumed 16.387s CPU time.
Dec 03 00:29:26 compute-1 podman[222912]: 2025-12-03 00:29:26.601920958 +0000 UTC m=+0.051498725 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 03 00:29:26 compute-1 systemd-machined[153454]: Machine qemu-29-instance-00000022 terminated.
Dec 03 00:29:26 compute-1 podman[222916]: 2025-12-03 00:29:26.638259346 +0000 UTC m=+0.084598725 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:29:26 compute-1 neutron-haproxy-ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352[222549]: [NOTICE]   (222566) : haproxy version is 3.0.5-8e879a5
Dec 03 00:29:26 compute-1 neutron-haproxy-ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352[222549]: [NOTICE]   (222566) : path to executable is /usr/sbin/haproxy
Dec 03 00:29:26 compute-1 neutron-haproxy-ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352[222549]: [WARNING]  (222566) : Exiting Master process...
Dec 03 00:29:26 compute-1 podman[222968]: 2025-12-03 00:29:26.651135788 +0000 UTC m=+0.029371751 container kill b2adc348e63a9482aaceaa78f8c3c5b0ce953ebb29453554f69a163f48efe9a8 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Dec 03 00:29:26 compute-1 neutron-haproxy-ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352[222549]: [ALERT]    (222566) : Current worker (222574) exited with code 143 (Terminated)
Dec 03 00:29:26 compute-1 neutron-haproxy-ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352[222549]: [WARNING]  (222566) : All workers exited. Exiting... (0)
Dec 03 00:29:26 compute-1 systemd[1]: libpod-b2adc348e63a9482aaceaa78f8c3c5b0ce953ebb29453554f69a163f48efe9a8.scope: Deactivated successfully.
Dec 03 00:29:26 compute-1 nova_compute[187157]: 2025-12-03 00:29:26.713 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:26 compute-1 nova_compute[187157]: 2025-12-03 00:29:26.717 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:26 compute-1 nova_compute[187157]: 2025-12-03 00:29:26.752 187161 INFO nova.virt.libvirt.driver [-] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Instance destroyed successfully.
Dec 03 00:29:26 compute-1 nova_compute[187157]: 2025-12-03 00:29:26.753 187161 DEBUG nova.objects.instance [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lazy-loading 'resources' on Instance uuid 1c32f4c5-c959-44c9-be90-2e0a08b52619 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:29:26 compute-1 nova_compute[187157]: 2025-12-03 00:29:26.795 187161 DEBUG nova.compute.manager [req-cd431bc9-85be-41bd-be0b-0d8a186e22d8 req-ff9dc503-3783-42cc-b813-91329ee4b03d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Received event network-vif-unplugged-21b5b048-03fd-4cce-b80e-426f2c35c56a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:29:26 compute-1 nova_compute[187157]: 2025-12-03 00:29:26.795 187161 DEBUG oslo_concurrency.lockutils [req-cd431bc9-85be-41bd-be0b-0d8a186e22d8 req-ff9dc503-3783-42cc-b813-91329ee4b03d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:29:26 compute-1 nova_compute[187157]: 2025-12-03 00:29:26.795 187161 DEBUG oslo_concurrency.lockutils [req-cd431bc9-85be-41bd-be0b-0d8a186e22d8 req-ff9dc503-3783-42cc-b813-91329ee4b03d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:29:26 compute-1 nova_compute[187157]: 2025-12-03 00:29:26.795 187161 DEBUG oslo_concurrency.lockutils [req-cd431bc9-85be-41bd-be0b-0d8a186e22d8 req-ff9dc503-3783-42cc-b813-91329ee4b03d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:29:26 compute-1 nova_compute[187157]: 2025-12-03 00:29:26.795 187161 DEBUG nova.compute.manager [req-cd431bc9-85be-41bd-be0b-0d8a186e22d8 req-ff9dc503-3783-42cc-b813-91329ee4b03d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] No waiting events found dispatching network-vif-unplugged-21b5b048-03fd-4cce-b80e-426f2c35c56a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:29:26 compute-1 nova_compute[187157]: 2025-12-03 00:29:26.795 187161 DEBUG nova.compute.manager [req-cd431bc9-85be-41bd-be0b-0d8a186e22d8 req-ff9dc503-3783-42cc-b813-91329ee4b03d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Received event network-vif-unplugged-21b5b048-03fd-4cce-b80e-426f2c35c56a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:29:26 compute-1 podman[223019]: 2025-12-03 00:29:26.862655599 +0000 UTC m=+0.023441408 container died b2adc348e63a9482aaceaa78f8c3c5b0ce953ebb29453554f69a163f48efe9a8 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 03 00:29:26 compute-1 systemd[1]: var-lib-containers-storage-overlay-8233503d943a27ff82aef839dd2ce0b83e3165669e489482cda93cc9c1c7aac8-merged.mount: Deactivated successfully.
Dec 03 00:29:26 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b2adc348e63a9482aaceaa78f8c3c5b0ce953ebb29453554f69a163f48efe9a8-userdata-shm.mount: Deactivated successfully.
Dec 03 00:29:26 compute-1 podman[223019]: 2025-12-03 00:29:26.911771056 +0000 UTC m=+0.072556835 container remove b2adc348e63a9482aaceaa78f8c3c5b0ce953ebb29453554f69a163f48efe9a8 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Dec 03 00:29:26 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:26.916 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[7693dd8a-df9c-423b-977a-ed2ff8cd2110]: (4, ("Wed Dec  3 12:29:26 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352 (b2adc348e63a9482aaceaa78f8c3c5b0ce953ebb29453554f69a163f48efe9a8)\nb2adc348e63a9482aaceaa78f8c3c5b0ce953ebb29453554f69a163f48efe9a8\nWed Dec  3 12:29:26 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352 (b2adc348e63a9482aaceaa78f8c3c5b0ce953ebb29453554f69a163f48efe9a8)\nb2adc348e63a9482aaceaa78f8c3c5b0ce953ebb29453554f69a163f48efe9a8\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:29:26 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:26.918 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[e7704741-8d10-4e63-8e2a-713c0e68a505]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:29:26 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:26.918 104348 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/47c9dea6-51f8-4918-b7de-0893eb139352.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/47c9dea6-51f8-4918-b7de-0893eb139352.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:29:26 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:26.919 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[be8dc81f-9a91-4bdb-9a5c-debeccf4a15d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:29:26 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:26.920 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47c9dea6-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:29:26 compute-1 systemd[1]: libpod-conmon-b2adc348e63a9482aaceaa78f8c3c5b0ce953ebb29453554f69a163f48efe9a8.scope: Deactivated successfully.
Dec 03 00:29:26 compute-1 nova_compute[187157]: 2025-12-03 00:29:26.970 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:26 compute-1 kernel: tap47c9dea6-50: left promiscuous mode
Dec 03 00:29:26 compute-1 nova_compute[187157]: 2025-12-03 00:29:26.984 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:26 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:26.986 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[0cb776ff-64c0-45e5-bf12-cac3e858adc8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:29:27 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:26.999 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[d1a299a5-3a0f-4f29-a815-12a3dd8eff81]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:29:27 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:27.000 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[59d3e054-3c61-4b48-b57a-227ada668fde]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:29:27 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:27.014 207957 DEBUG oslo.privsep.daemon [-] privsep: reply[66b8ab82-12aa-4667-9804-401d6af0472e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572020, 'reachable_time': 16263, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223041, 'error': None, 'target': 'ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:29:27 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:27.016 104464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:29:27 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:27.016 104464 DEBUG oslo.privsep.daemon [-] privsep: reply[6197bd1f-368e-4c88-8e9f-e868a3819f3a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:29:27 compute-1 systemd[1]: run-netns-ovnmeta\x2d47c9dea6\x2d51f8\x2d4918\x2db7de\x2d0893eb139352.mount: Deactivated successfully.
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.317 187161 DEBUG nova.virt.libvirt.vif [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-03T00:28:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategyVolume-server-1212261301',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategyvolume-server-121226130',id=34,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:28:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='079699d388d64224949dbfaf77fa93bd',ramdisk_id='',reservation_id='r-85br1zms',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='vi
rtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategyVolume-1776646898',owner_user_name='tempest-TestExecuteZoneMigrationStrategyVolume-1776646898-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:28:22Z,user_data=None,user_id='bc59879cc7d442cb9c60a8c6aebf4e24',uuid=1c32f4c5-c959-44c9-be90-2e0a08b52619,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "address": "fa:16:3e:9b:03:35", "network": {"id": "47c9dea6-51f8-4918-b7de-0893eb139352", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-629213992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5bcb6274878430cbf268fcd97e3d9d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21b5b048-03", "ovs_interfaceid": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.318 187161 DEBUG nova.network.os_vif_util [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Converting VIF {"id": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "address": "fa:16:3e:9b:03:35", "network": {"id": "47c9dea6-51f8-4918-b7de-0893eb139352", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-629213992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5bcb6274878430cbf268fcd97e3d9d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21b5b048-03", "ovs_interfaceid": "21b5b048-03fd-4cce-b80e-426f2c35c56a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.319 187161 DEBUG nova.network.os_vif_util [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:03:35,bridge_name='br-int',has_traffic_filtering=True,id=21b5b048-03fd-4cce-b80e-426f2c35c56a,network=Network(47c9dea6-51f8-4918-b7de-0893eb139352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21b5b048-03') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.319 187161 DEBUG os_vif [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:03:35,bridge_name='br-int',has_traffic_filtering=True,id=21b5b048-03fd-4cce-b80e-426f2c35c56a,network=Network(47c9dea6-51f8-4918-b7de-0893eb139352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21b5b048-03') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.322 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.323 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap21b5b048-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.325 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.327 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.328 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.329 187161 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=17b412c3-63f6-43eb-a173-c3a7b0cc96e5) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.330 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.331 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:27 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:27.332 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:29:27 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:27.334 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.335 187161 INFO os_vif [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:03:35,bridge_name='br-int',has_traffic_filtering=True,id=21b5b048-03fd-4cce-b80e-426f2c35c56a,network=Network(47c9dea6-51f8-4918-b7de-0893eb139352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21b5b048-03')
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.336 187161 INFO nova.virt.libvirt.driver [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Deleting instance files /var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619_del
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.337 187161 INFO nova.virt.libvirt.driver [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Deletion of /var/lib/nova/instances/1c32f4c5-c959-44c9-be90-2e0a08b52619_del complete
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.853 187161 INFO nova.compute.manager [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Took 1.36 seconds to destroy the instance on the hypervisor.
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.854 187161 DEBUG oslo.service.backend._eventlet.loopingcall [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.854 187161 DEBUG nova.compute.manager [-] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.855 187161 DEBUG nova.network.neutron [-] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.855 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:29:27 compute-1 nova_compute[187157]: 2025-12-03 00:29:27.992 187161 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:29:28 compute-1 nova_compute[187157]: 2025-12-03 00:29:28.328 187161 DEBUG nova.compute.manager [req-67c6fe33-fa56-474a-9570-aa78aa44d8a8 req-f11dde7d-b2c8-481c-8257-3dc8b34c7a42 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Received event network-vif-deleted-21b5b048-03fd-4cce-b80e-426f2c35c56a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:29:28 compute-1 nova_compute[187157]: 2025-12-03 00:29:28.329 187161 INFO nova.compute.manager [req-67c6fe33-fa56-474a-9570-aa78aa44d8a8 req-f11dde7d-b2c8-481c-8257-3dc8b34c7a42 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Neutron deleted interface 21b5b048-03fd-4cce-b80e-426f2c35c56a; detaching it from the instance and deleting it from the info cache
Dec 03 00:29:28 compute-1 nova_compute[187157]: 2025-12-03 00:29:28.329 187161 DEBUG nova.network.neutron [req-67c6fe33-fa56-474a-9570-aa78aa44d8a8 req-f11dde7d-b2c8-481c-8257-3dc8b34c7a42 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:29:28 compute-1 nova_compute[187157]: 2025-12-03 00:29:28.762 187161 DEBUG nova.network.neutron [-] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:29:28 compute-1 nova_compute[187157]: 2025-12-03 00:29:28.840 187161 DEBUG nova.compute.manager [req-67c6fe33-fa56-474a-9570-aa78aa44d8a8 req-f11dde7d-b2c8-481c-8257-3dc8b34c7a42 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Detach interface failed, port_id=21b5b048-03fd-4cce-b80e-426f2c35c56a, reason: Instance 1c32f4c5-c959-44c9-be90-2e0a08b52619 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:29:28 compute-1 nova_compute[187157]: 2025-12-03 00:29:28.870 187161 DEBUG nova.compute.manager [req-dbad0186-0546-4729-b24b-8423de812752 req-d237cdea-4aa1-4e21-97bd-c10e16109e59 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Received event network-vif-unplugged-21b5b048-03fd-4cce-b80e-426f2c35c56a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:29:28 compute-1 nova_compute[187157]: 2025-12-03 00:29:28.870 187161 DEBUG oslo_concurrency.lockutils [req-dbad0186-0546-4729-b24b-8423de812752 req-d237cdea-4aa1-4e21-97bd-c10e16109e59 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:29:28 compute-1 nova_compute[187157]: 2025-12-03 00:29:28.871 187161 DEBUG oslo_concurrency.lockutils [req-dbad0186-0546-4729-b24b-8423de812752 req-d237cdea-4aa1-4e21-97bd-c10e16109e59 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:29:28 compute-1 nova_compute[187157]: 2025-12-03 00:29:28.871 187161 DEBUG oslo_concurrency.lockutils [req-dbad0186-0546-4729-b24b-8423de812752 req-d237cdea-4aa1-4e21-97bd-c10e16109e59 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:29:28 compute-1 nova_compute[187157]: 2025-12-03 00:29:28.871 187161 DEBUG nova.compute.manager [req-dbad0186-0546-4729-b24b-8423de812752 req-d237cdea-4aa1-4e21-97bd-c10e16109e59 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] No waiting events found dispatching network-vif-unplugged-21b5b048-03fd-4cce-b80e-426f2c35c56a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:29:28 compute-1 nova_compute[187157]: 2025-12-03 00:29:28.871 187161 DEBUG nova.compute.manager [req-dbad0186-0546-4729-b24b-8423de812752 req-d237cdea-4aa1-4e21-97bd-c10e16109e59 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Received event network-vif-unplugged-21b5b048-03fd-4cce-b80e-426f2c35c56a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:29:29 compute-1 nova_compute[187157]: 2025-12-03 00:29:29.270 187161 INFO nova.compute.manager [-] [instance: 1c32f4c5-c959-44c9-be90-2e0a08b52619] Took 1.41 seconds to deallocate network for instance.
Dec 03 00:29:29 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:29:29.336 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:29:29 compute-1 nova_compute[187157]: 2025-12-03 00:29:29.794 187161 DEBUG oslo_concurrency.lockutils [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:29:29 compute-1 nova_compute[187157]: 2025-12-03 00:29:29.794 187161 DEBUG oslo_concurrency.lockutils [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:29:29 compute-1 nova_compute[187157]: 2025-12-03 00:29:29.870 187161 DEBUG nova.compute.provider_tree [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:29:30 compute-1 nova_compute[187157]: 2025-12-03 00:29:30.379 187161 DEBUG nova.scheduler.client.report [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:29:30 compute-1 nova_compute[187157]: 2025-12-03 00:29:30.892 187161 DEBUG oslo_concurrency.lockutils [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.097s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:29:30 compute-1 nova_compute[187157]: 2025-12-03 00:29:30.923 187161 INFO nova.scheduler.client.report [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Deleted allocations for instance 1c32f4c5-c959-44c9-be90-2e0a08b52619
Dec 03 00:29:31 compute-1 nova_compute[187157]: 2025-12-03 00:29:31.154 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:31 compute-1 nova_compute[187157]: 2025-12-03 00:29:31.950 187161 DEBUG oslo_concurrency.lockutils [None req-8ef08152-47b5-4c45-9b52-0ed2003117b2 bc59879cc7d442cb9c60a8c6aebf4e24 079699d388d64224949dbfaf77fa93bd - - default default] Lock "1c32f4c5-c959-44c9-be90-2e0a08b52619" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.987s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:29:32 compute-1 nova_compute[187157]: 2025-12-03 00:29:32.330 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:35 compute-1 podman[197537]: time="2025-12-03T00:29:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:29:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:29:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:29:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:29:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2619 "" "Go-http-client/1.1"
Dec 03 00:29:36 compute-1 nova_compute[187157]: 2025-12-03 00:29:36.156 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:37 compute-1 nova_compute[187157]: 2025-12-03 00:29:37.333 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:38 compute-1 nova_compute[187157]: 2025-12-03 00:29:38.247 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:41 compute-1 nova_compute[187157]: 2025-12-03 00:29:41.157 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:42 compute-1 podman[223046]: 2025-12-03 00:29:42.209471705 +0000 UTC m=+0.052605973 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Dec 03 00:29:42 compute-1 nova_compute[187157]: 2025-12-03 00:29:42.334 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:45 compute-1 podman[223068]: 2025-12-03 00:29:45.201211411 +0000 UTC m=+0.049164719 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest)
Dec 03 00:29:46 compute-1 nova_compute[187157]: 2025-12-03 00:29:46.159 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:47 compute-1 nova_compute[187157]: 2025-12-03 00:29:47.373 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:49 compute-1 openstack_network_exporter[199685]: ERROR   00:29:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:29:49 compute-1 openstack_network_exporter[199685]: ERROR   00:29:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:29:49 compute-1 openstack_network_exporter[199685]: ERROR   00:29:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:29:49 compute-1 openstack_network_exporter[199685]: ERROR   00:29:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:29:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:29:49 compute-1 openstack_network_exporter[199685]: ERROR   00:29:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:29:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:29:51 compute-1 nova_compute[187157]: 2025-12-03 00:29:51.161 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:52 compute-1 nova_compute[187157]: 2025-12-03 00:29:52.375 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:54 compute-1 podman[223088]: 2025-12-03 00:29:54.235193153 +0000 UTC m=+0.073669561 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:29:56 compute-1 nova_compute[187157]: 2025-12-03 00:29:56.163 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:56 compute-1 nova_compute[187157]: 2025-12-03 00:29:56.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:29:57 compute-1 podman[223113]: 2025-12-03 00:29:57.202659984 +0000 UTC m=+0.048422790 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:29:57 compute-1 podman[223114]: 2025-12-03 00:29:57.247912347 +0000 UTC m=+0.084700127 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Dec 03 00:29:57 compute-1 nova_compute[187157]: 2025-12-03 00:29:57.376 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:59 compute-1 nova_compute[187157]: 2025-12-03 00:29:59.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:30:00 compute-1 nova_compute[187157]: 2025-12-03 00:30:00.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:30:01 compute-1 nova_compute[187157]: 2025-12-03 00:30:01.165 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:30:01.760 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:30:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:30:01.760 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:30:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:30:01.760 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:30:02 compute-1 nova_compute[187157]: 2025-12-03 00:30:02.441 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:04 compute-1 nova_compute[187157]: 2025-12-03 00:30:04.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:30:05 compute-1 nova_compute[187157]: 2025-12-03 00:30:05.419 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:30:05 compute-1 nova_compute[187157]: 2025-12-03 00:30:05.420 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:30:05 compute-1 nova_compute[187157]: 2025-12-03 00:30:05.420 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:30:05 compute-1 nova_compute[187157]: 2025-12-03 00:30:05.421 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:30:05 compute-1 nova_compute[187157]: 2025-12-03 00:30:05.641 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:30:05 compute-1 nova_compute[187157]: 2025-12-03 00:30:05.643 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:30:05 compute-1 podman[197537]: time="2025-12-03T00:30:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:30:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:30:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:30:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:30:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2618 "" "Go-http-client/1.1"
Dec 03 00:30:05 compute-1 nova_compute[187157]: 2025-12-03 00:30:05.666 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:30:05 compute-1 nova_compute[187157]: 2025-12-03 00:30:05.667 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5755MB free_disk=73.16099166870117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:30:05 compute-1 nova_compute[187157]: 2025-12-03 00:30:05.667 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:30:05 compute-1 nova_compute[187157]: 2025-12-03 00:30:05.668 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:30:06 compute-1 nova_compute[187157]: 2025-12-03 00:30:06.166 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:06 compute-1 nova_compute[187157]: 2025-12-03 00:30:06.803 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:30:06 compute-1 nova_compute[187157]: 2025-12-03 00:30:06.803 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:30:05 up  1:37,  0 user,  load average: 0.20, 0.25, 0.26\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:30:06 compute-1 nova_compute[187157]: 2025-12-03 00:30:06.825 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:30:07 compute-1 nova_compute[187157]: 2025-12-03 00:30:07.331 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:30:07 compute-1 nova_compute[187157]: 2025-12-03 00:30:07.442 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:08 compute-1 nova_compute[187157]: 2025-12-03 00:30:08.066 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:30:08 compute-1 nova_compute[187157]: 2025-12-03 00:30:08.067 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.399s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:30:10 compute-1 nova_compute[187157]: 2025-12-03 00:30:10.062 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:30:10 compute-1 nova_compute[187157]: 2025-12-03 00:30:10.063 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:30:10 compute-1 nova_compute[187157]: 2025-12-03 00:30:10.063 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:30:11 compute-1 nova_compute[187157]: 2025-12-03 00:30:11.167 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:12 compute-1 nova_compute[187157]: 2025-12-03 00:30:12.445 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:13 compute-1 podman[223160]: 2025-12-03 00:30:13.214153613 +0000 UTC m=+0.055906942 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 03 00:30:13 compute-1 nova_compute[187157]: 2025-12-03 00:30:13.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:30:16 compute-1 nova_compute[187157]: 2025-12-03 00:30:16.169 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:16 compute-1 podman[223182]: 2025-12-03 00:30:16.213631727 +0000 UTC m=+0.052631183 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:30:16 compute-1 ovn_controller[95464]: 2025-12-03T00:30:16Z|00317|memory_trim|INFO|Detected inactivity (last active 30020 ms ago): trimming memory
Dec 03 00:30:16 compute-1 nova_compute[187157]: 2025-12-03 00:30:16.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:30:17 compute-1 nova_compute[187157]: 2025-12-03 00:30:17.446 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:19 compute-1 openstack_network_exporter[199685]: ERROR   00:30:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:30:19 compute-1 openstack_network_exporter[199685]: ERROR   00:30:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:30:19 compute-1 openstack_network_exporter[199685]: ERROR   00:30:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:30:19 compute-1 openstack_network_exporter[199685]: ERROR   00:30:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:30:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:30:19 compute-1 openstack_network_exporter[199685]: ERROR   00:30:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:30:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:30:21 compute-1 nova_compute[187157]: 2025-12-03 00:30:21.171 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:22 compute-1 nova_compute[187157]: 2025-12-03 00:30:22.448 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:25 compute-1 podman[223202]: 2025-12-03 00:30:25.207292275 +0000 UTC m=+0.046657838 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:30:26 compute-1 nova_compute[187157]: 2025-12-03 00:30:26.171 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:27 compute-1 nova_compute[187157]: 2025-12-03 00:30:27.451 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:28 compute-1 podman[223224]: 2025-12-03 00:30:28.198567992 +0000 UTC m=+0.046137856 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 03 00:30:28 compute-1 podman[223225]: 2025-12-03 00:30:28.227203614 +0000 UTC m=+0.070548205 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 03 00:30:31 compute-1 nova_compute[187157]: 2025-12-03 00:30:31.173 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:32 compute-1 nova_compute[187157]: 2025-12-03 00:30:32.453 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:35 compute-1 podman[197537]: time="2025-12-03T00:30:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:30:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:30:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:30:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:30:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2618 "" "Go-http-client/1.1"
Dec 03 00:30:36 compute-1 nova_compute[187157]: 2025-12-03 00:30:36.176 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:37 compute-1 nova_compute[187157]: 2025-12-03 00:30:37.454 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:41 compute-1 nova_compute[187157]: 2025-12-03 00:30:41.177 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:42 compute-1 nova_compute[187157]: 2025-12-03 00:30:42.456 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:44 compute-1 podman[223269]: 2025-12-03 00:30:44.218239159 +0000 UTC m=+0.055153414 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, architecture=x86_64, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 03 00:30:46 compute-1 nova_compute[187157]: 2025-12-03 00:30:46.221 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:47 compute-1 sshd-session[223291]: Accepted publickey for zuul from 192.168.122.10 port 52696 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 03 00:30:47 compute-1 systemd-logind[790]: New session 39 of user zuul.
Dec 03 00:30:47 compute-1 systemd[1]: Started Session 39 of User zuul.
Dec 03 00:30:47 compute-1 sshd-session[223291]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 00:30:47 compute-1 podman[223293]: 2025-12-03 00:30:47.221226799 +0000 UTC m=+0.056372693 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:30:47 compute-1 sudo[223316]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 03 00:30:47 compute-1 sudo[223316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 00:30:47 compute-1 nova_compute[187157]: 2025-12-03 00:30:47.457 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:49 compute-1 openstack_network_exporter[199685]: ERROR   00:30:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:30:49 compute-1 openstack_network_exporter[199685]: ERROR   00:30:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:30:49 compute-1 openstack_network_exporter[199685]: ERROR   00:30:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:30:49 compute-1 openstack_network_exporter[199685]: ERROR   00:30:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:30:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:30:49 compute-1 openstack_network_exporter[199685]: ERROR   00:30:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:30:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:30:51 compute-1 nova_compute[187157]: 2025-12-03 00:30:51.222 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:51 compute-1 ovs-vsctl[223489]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 03 00:30:52 compute-1 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 223340 (sos)
Dec 03 00:30:52 compute-1 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 03 00:30:52 compute-1 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 03 00:30:52 compute-1 nova_compute[187157]: 2025-12-03 00:30:52.459 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:52 compute-1 virtqemud[186882]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 03 00:30:52 compute-1 virtqemud[186882]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 03 00:30:52 compute-1 virtqemud[186882]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 03 00:30:53 compute-1 crontab[223910]: (root) LIST (root)
Dec 03 00:30:55 compute-1 systemd[1]: Starting Hostname Service...
Dec 03 00:30:56 compute-1 podman[224026]: 2025-12-03 00:30:56.042027019 +0000 UTC m=+0.068309942 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:30:56 compute-1 systemd[1]: Started Hostname Service.
Dec 03 00:30:56 compute-1 nova_compute[187157]: 2025-12-03 00:30:56.224 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:56 compute-1 nova_compute[187157]: 2025-12-03 00:30:56.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:30:57 compute-1 nova_compute[187157]: 2025-12-03 00:30:57.460 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:58 compute-1 podman[224290]: 2025-12-03 00:30:58.299068692 +0000 UTC m=+0.049483357 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 03 00:30:58 compute-1 podman[224293]: 2025-12-03 00:30:58.331228129 +0000 UTC m=+0.075542297 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:30:59 compute-1 nova_compute[187157]: 2025-12-03 00:30:59.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:01 compute-1 sshd-session[224579]: Invalid user sol from 45.148.10.240 port 36942
Dec 03 00:31:01 compute-1 sshd-session[224579]: Connection closed by invalid user sol 45.148.10.240 port 36942 [preauth]
Dec 03 00:31:01 compute-1 nova_compute[187157]: 2025-12-03 00:31:01.226 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:31:01.762 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:31:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:31:01.762 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:31:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:31:01.762 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:31:02 compute-1 nova_compute[187157]: 2025-12-03 00:31:02.461 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:02 compute-1 ovs-appctl[225239]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 03 00:31:02 compute-1 ovs-appctl[225250]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 03 00:31:02 compute-1 ovs-appctl[225257]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 03 00:31:02 compute-1 nova_compute[187157]: 2025-12-03 00:31:02.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:05 compute-1 podman[197537]: time="2025-12-03T00:31:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:31:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:31:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:31:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:31:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2618 "" "Go-http-client/1.1"
Dec 03 00:31:05 compute-1 nova_compute[187157]: 2025-12-03 00:31:05.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:06 compute-1 nova_compute[187157]: 2025-12-03 00:31:06.228 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:06 compute-1 nova_compute[187157]: 2025-12-03 00:31:06.544 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:31:06 compute-1 nova_compute[187157]: 2025-12-03 00:31:06.544 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:31:06 compute-1 nova_compute[187157]: 2025-12-03 00:31:06.544 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:31:06 compute-1 nova_compute[187157]: 2025-12-03 00:31:06.545 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:31:06 compute-1 nova_compute[187157]: 2025-12-03 00:31:06.690 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:31:06 compute-1 nova_compute[187157]: 2025-12-03 00:31:06.692 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:31:06 compute-1 nova_compute[187157]: 2025-12-03 00:31:06.713 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:31:06 compute-1 nova_compute[187157]: 2025-12-03 00:31:06.714 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5460MB free_disk=72.83904647827148GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:31:06 compute-1 nova_compute[187157]: 2025-12-03 00:31:06.714 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:31:06 compute-1 nova_compute[187157]: 2025-12-03 00:31:06.714 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:31:07 compute-1 nova_compute[187157]: 2025-12-03 00:31:07.464 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:08 compute-1 nova_compute[187157]: 2025-12-03 00:31:08.129 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:31:08 compute-1 nova_compute[187157]: 2025-12-03 00:31:08.129 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:31:06 up  1:38,  0 user,  load average: 0.84, 0.37, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:31:08 compute-1 nova_compute[187157]: 2025-12-03 00:31:08.150 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:31:08 compute-1 nova_compute[187157]: 2025-12-03 00:31:08.663 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:31:09 compute-1 nova_compute[187157]: 2025-12-03 00:31:09.171 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:31:09 compute-1 nova_compute[187157]: 2025-12-03 00:31:09.172 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.458s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:31:10 compute-1 virtqemud[186882]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 03 00:31:11 compute-1 nova_compute[187157]: 2025-12-03 00:31:11.229 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:11 compute-1 systemd[1]: Starting Time & Date Service...
Dec 03 00:31:11 compute-1 systemd[1]: Started Time & Date Service.
Dec 03 00:31:12 compute-1 nova_compute[187157]: 2025-12-03 00:31:12.167 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:12 compute-1 nova_compute[187157]: 2025-12-03 00:31:12.168 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:12 compute-1 nova_compute[187157]: 2025-12-03 00:31:12.168 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:31:12 compute-1 nova_compute[187157]: 2025-12-03 00:31:12.466 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:13 compute-1 nova_compute[187157]: 2025-12-03 00:31:13.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:15 compute-1 podman[226634]: 2025-12-03 00:31:15.232394347 +0000 UTC m=+0.072851681 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 03 00:31:16 compute-1 nova_compute[187157]: 2025-12-03 00:31:16.232 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:17 compute-1 nova_compute[187157]: 2025-12-03 00:31:17.467 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:17 compute-1 podman[226658]: 2025-12-03 00:31:17.524990789 +0000 UTC m=+0.062912772 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:31:18 compute-1 nova_compute[187157]: 2025-12-03 00:31:18.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:19 compute-1 openstack_network_exporter[199685]: ERROR   00:31:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:31:19 compute-1 openstack_network_exporter[199685]: ERROR   00:31:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:31:19 compute-1 openstack_network_exporter[199685]: ERROR   00:31:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:31:19 compute-1 openstack_network_exporter[199685]: ERROR   00:31:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:31:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:31:19 compute-1 openstack_network_exporter[199685]: ERROR   00:31:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:31:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:31:19 compute-1 nova_compute[187157]: 2025-12-03 00:31:19.695 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:20 compute-1 sshd-session[226675]: Received disconnect from 117.5.148.56 port 34130:11:  [preauth]
Dec 03 00:31:20 compute-1 sshd-session[226675]: Disconnected from authenticating user root 117.5.148.56 port 34130 [preauth]
Dec 03 00:31:21 compute-1 nova_compute[187157]: 2025-12-03 00:31:21.234 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:22 compute-1 nova_compute[187157]: 2025-12-03 00:31:22.469 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:26 compute-1 podman[226677]: 2025-12-03 00:31:26.222197452 +0000 UTC m=+0.060972654 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:31:26 compute-1 nova_compute[187157]: 2025-12-03 00:31:26.236 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:27 compute-1 nova_compute[187157]: 2025-12-03 00:31:27.515 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:28 compute-1 podman[226701]: 2025-12-03 00:31:28.768769991 +0000 UTC m=+0.064036708 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 03 00:31:28 compute-1 podman[226702]: 2025-12-03 00:31:28.825687057 +0000 UTC m=+0.108862672 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:31:31 compute-1 nova_compute[187157]: 2025-12-03 00:31:31.285 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:32 compute-1 nova_compute[187157]: 2025-12-03 00:31:32.517 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:34 compute-1 sudo[223316]: pam_unix(sudo:session): session closed for user root
Dec 03 00:31:34 compute-1 sshd-session[223300]: Received disconnect from 192.168.122.10 port 52696:11: disconnected by user
Dec 03 00:31:34 compute-1 sshd-session[223300]: Disconnected from user zuul 192.168.122.10 port 52696
Dec 03 00:31:34 compute-1 sshd-session[223291]: pam_unix(sshd:session): session closed for user zuul
Dec 03 00:31:34 compute-1 systemd[1]: session-39.scope: Deactivated successfully.
Dec 03 00:31:34 compute-1 systemd[1]: session-39.scope: Consumed 1min 16.689s CPU time, 539.9M memory peak, read 133.9M from disk, written 31.1M to disk.
Dec 03 00:31:34 compute-1 systemd-logind[790]: Session 39 logged out. Waiting for processes to exit.
Dec 03 00:31:34 compute-1 systemd-logind[790]: Removed session 39.
Dec 03 00:31:34 compute-1 sshd-session[226746]: Accepted publickey for zuul from 192.168.122.10 port 49548 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 03 00:31:34 compute-1 systemd-logind[790]: New session 40 of user zuul.
Dec 03 00:31:34 compute-1 systemd[1]: Started Session 40 of User zuul.
Dec 03 00:31:34 compute-1 sshd-session[226746]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 00:31:35 compute-1 sudo[226750]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-1-2025-12-03-udtysbx.tar.xz
Dec 03 00:31:35 compute-1 sudo[226750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 00:31:35 compute-1 sudo[226750]: pam_unix(sudo:session): session closed for user root
Dec 03 00:31:35 compute-1 sshd-session[226749]: Received disconnect from 192.168.122.10 port 49548:11: disconnected by user
Dec 03 00:31:35 compute-1 sshd-session[226749]: Disconnected from user zuul 192.168.122.10 port 49548
Dec 03 00:31:35 compute-1 sshd-session[226746]: pam_unix(sshd:session): session closed for user zuul
Dec 03 00:31:35 compute-1 systemd[1]: session-40.scope: Deactivated successfully.
Dec 03 00:31:35 compute-1 systemd-logind[790]: Session 40 logged out. Waiting for processes to exit.
Dec 03 00:31:35 compute-1 systemd-logind[790]: Removed session 40.
Dec 03 00:31:35 compute-1 sshd-session[226775]: Accepted publickey for zuul from 192.168.122.10 port 49558 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 03 00:31:35 compute-1 systemd-logind[790]: New session 41 of user zuul.
Dec 03 00:31:35 compute-1 systemd[1]: Started Session 41 of User zuul.
Dec 03 00:31:35 compute-1 sshd-session[226775]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 00:31:35 compute-1 sudo[226779]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Dec 03 00:31:35 compute-1 sudo[226779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 00:31:35 compute-1 sudo[226779]: pam_unix(sudo:session): session closed for user root
Dec 03 00:31:35 compute-1 sshd-session[226778]: Received disconnect from 192.168.122.10 port 49558:11: disconnected by user
Dec 03 00:31:35 compute-1 sshd-session[226778]: Disconnected from user zuul 192.168.122.10 port 49558
Dec 03 00:31:35 compute-1 sshd-session[226775]: pam_unix(sshd:session): session closed for user zuul
Dec 03 00:31:35 compute-1 systemd[1]: session-41.scope: Deactivated successfully.
Dec 03 00:31:35 compute-1 systemd-logind[790]: Session 41 logged out. Waiting for processes to exit.
Dec 03 00:31:35 compute-1 systemd-logind[790]: Removed session 41.
Dec 03 00:31:35 compute-1 podman[197537]: time="2025-12-03T00:31:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:31:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:31:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:31:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:31:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2613 "" "Go-http-client/1.1"
Dec 03 00:31:36 compute-1 nova_compute[187157]: 2025-12-03 00:31:36.287 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:37 compute-1 nova_compute[187157]: 2025-12-03 00:31:37.519 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:41 compute-1 nova_compute[187157]: 2025-12-03 00:31:41.289 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:41 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 03 00:31:41 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 03 00:31:42 compute-1 nova_compute[187157]: 2025-12-03 00:31:42.520 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:46 compute-1 podman[226809]: 2025-12-03 00:31:46.264502149 +0000 UTC m=+0.099151698 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible)
Dec 03 00:31:46 compute-1 nova_compute[187157]: 2025-12-03 00:31:46.291 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:47 compute-1 nova_compute[187157]: 2025-12-03 00:31:47.522 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:48 compute-1 podman[226830]: 2025-12-03 00:31:48.205525314 +0000 UTC m=+0.048900512 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:31:49 compute-1 openstack_network_exporter[199685]: ERROR   00:31:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:31:49 compute-1 openstack_network_exporter[199685]: ERROR   00:31:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:31:49 compute-1 openstack_network_exporter[199685]: ERROR   00:31:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:31:49 compute-1 openstack_network_exporter[199685]: ERROR   00:31:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:31:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:31:49 compute-1 openstack_network_exporter[199685]: ERROR   00:31:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:31:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:31:51 compute-1 nova_compute[187157]: 2025-12-03 00:31:51.294 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:52 compute-1 nova_compute[187157]: 2025-12-03 00:31:52.524 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:56 compute-1 nova_compute[187157]: 2025-12-03 00:31:56.334 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:57 compute-1 podman[226850]: 2025-12-03 00:31:57.200233177 +0000 UTC m=+0.046417142 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:31:57 compute-1 nova_compute[187157]: 2025-12-03 00:31:57.568 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:57 compute-1 nova_compute[187157]: 2025-12-03 00:31:57.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:59 compute-1 podman[226874]: 2025-12-03 00:31:59.231405012 +0000 UTC m=+0.061333233 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Dec 03 00:31:59 compute-1 podman[226875]: 2025-12-03 00:31:59.263226321 +0000 UTC m=+0.097389974 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:31:59 compute-1 nova_compute[187157]: 2025-12-03 00:31:59.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:32:01 compute-1 nova_compute[187157]: 2025-12-03 00:32:01.337 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:32:01.763 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:32:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:32:01.763 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:32:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:32:01.763 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:32:02 compute-1 nova_compute[187157]: 2025-12-03 00:32:02.571 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:02 compute-1 nova_compute[187157]: 2025-12-03 00:32:02.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:32:05 compute-1 podman[197537]: time="2025-12-03T00:32:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:32:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:32:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:32:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:32:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2619 "" "Go-http-client/1.1"
Dec 03 00:32:05 compute-1 nova_compute[187157]: 2025-12-03 00:32:05.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:32:06 compute-1 nova_compute[187157]: 2025-12-03 00:32:06.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:32:06 compute-1 nova_compute[187157]: 2025-12-03 00:32:06.217 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:32:06 compute-1 nova_compute[187157]: 2025-12-03 00:32:06.217 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:32:06 compute-1 nova_compute[187157]: 2025-12-03 00:32:06.217 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:32:06 compute-1 nova_compute[187157]: 2025-12-03 00:32:06.338 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:06 compute-1 nova_compute[187157]: 2025-12-03 00:32:06.373 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:32:06 compute-1 nova_compute[187157]: 2025-12-03 00:32:06.374 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:32:06 compute-1 nova_compute[187157]: 2025-12-03 00:32:06.395 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:32:06 compute-1 nova_compute[187157]: 2025-12-03 00:32:06.396 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5692MB free_disk=73.16062545776367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:32:06 compute-1 nova_compute[187157]: 2025-12-03 00:32:06.396 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:32:06 compute-1 nova_compute[187157]: 2025-12-03 00:32:06.396 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:32:07 compute-1 nova_compute[187157]: 2025-12-03 00:32:07.547 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:32:07 compute-1 nova_compute[187157]: 2025-12-03 00:32:07.548 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:32:06 up  1:39,  0 user,  load average: 0.54, 0.38, 0.31\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:32:07 compute-1 nova_compute[187157]: 2025-12-03 00:32:07.570 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:32:07 compute-1 nova_compute[187157]: 2025-12-03 00:32:07.575 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:08 compute-1 nova_compute[187157]: 2025-12-03 00:32:08.088 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:32:08 compute-1 nova_compute[187157]: 2025-12-03 00:32:08.628 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:32:08 compute-1 nova_compute[187157]: 2025-12-03 00:32:08.629 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.233s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:32:11 compute-1 nova_compute[187157]: 2025-12-03 00:32:11.341 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:12 compute-1 nova_compute[187157]: 2025-12-03 00:32:12.577 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:12 compute-1 nova_compute[187157]: 2025-12-03 00:32:12.624 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:32:12 compute-1 nova_compute[187157]: 2025-12-03 00:32:12.625 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:32:12 compute-1 nova_compute[187157]: 2025-12-03 00:32:12.625 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:32:13 compute-1 nova_compute[187157]: 2025-12-03 00:32:13.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:32:16 compute-1 nova_compute[187157]: 2025-12-03 00:32:16.398 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:17 compute-1 podman[226924]: 2025-12-03 00:32:17.244585162 +0000 UTC m=+0.085535328 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal)
Dec 03 00:32:17 compute-1 nova_compute[187157]: 2025-12-03 00:32:17.613 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:19 compute-1 podman[226945]: 2025-12-03 00:32:19.242947084 +0000 UTC m=+0.080042865 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Dec 03 00:32:19 compute-1 openstack_network_exporter[199685]: ERROR   00:32:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:32:19 compute-1 openstack_network_exporter[199685]: ERROR   00:32:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:32:19 compute-1 openstack_network_exporter[199685]: ERROR   00:32:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:32:19 compute-1 openstack_network_exporter[199685]: ERROR   00:32:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:32:19 compute-1 openstack_network_exporter[199685]: ERROR   00:32:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:32:19 compute-1 nova_compute[187157]: 2025-12-03 00:32:19.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:32:21 compute-1 nova_compute[187157]: 2025-12-03 00:32:21.398 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:22 compute-1 nova_compute[187157]: 2025-12-03 00:32:22.614 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:26 compute-1 nova_compute[187157]: 2025-12-03 00:32:26.400 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:27 compute-1 nova_compute[187157]: 2025-12-03 00:32:27.617 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:28 compute-1 podman[226965]: 2025-12-03 00:32:28.200620952 +0000 UTC m=+0.047007798 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:32:30 compute-1 podman[226990]: 2025-12-03 00:32:30.21414693 +0000 UTC m=+0.058597337 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:32:30 compute-1 podman[226991]: 2025-12-03 00:32:30.273539825 +0000 UTC m=+0.108191165 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 03 00:32:31 compute-1 nova_compute[187157]: 2025-12-03 00:32:31.402 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:32 compute-1 nova_compute[187157]: 2025-12-03 00:32:32.661 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:35 compute-1 podman[197537]: time="2025-12-03T00:32:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:32:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:32:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:32:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:32:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2613 "" "Go-http-client/1.1"
Dec 03 00:32:36 compute-1 nova_compute[187157]: 2025-12-03 00:32:36.403 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:37 compute-1 nova_compute[187157]: 2025-12-03 00:32:37.662 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:41 compute-1 nova_compute[187157]: 2025-12-03 00:32:41.454 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:42 compute-1 nova_compute[187157]: 2025-12-03 00:32:42.705 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:46 compute-1 nova_compute[187157]: 2025-12-03 00:32:46.507 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:47 compute-1 nova_compute[187157]: 2025-12-03 00:32:47.707 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:48 compute-1 podman[227034]: 2025-12-03 00:32:48.219852128 +0000 UTC m=+0.058806082 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:32:49 compute-1 openstack_network_exporter[199685]: ERROR   00:32:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:32:49 compute-1 openstack_network_exporter[199685]: ERROR   00:32:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:32:49 compute-1 openstack_network_exporter[199685]: ERROR   00:32:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:32:49 compute-1 openstack_network_exporter[199685]: ERROR   00:32:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:32:49 compute-1 openstack_network_exporter[199685]: ERROR   00:32:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:32:50 compute-1 podman[227055]: 2025-12-03 00:32:50.225782153 +0000 UTC m=+0.068926618 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Dec 03 00:32:51 compute-1 nova_compute[187157]: 2025-12-03 00:32:51.535 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:52 compute-1 nova_compute[187157]: 2025-12-03 00:32:52.758 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:56 compute-1 nova_compute[187157]: 2025-12-03 00:32:56.574 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:57 compute-1 nova_compute[187157]: 2025-12-03 00:32:57.760 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:58 compute-1 nova_compute[187157]: 2025-12-03 00:32:58.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:32:59 compute-1 podman[227076]: 2025-12-03 00:32:59.263408702 +0000 UTC m=+0.095040687 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:33:00 compute-1 nova_compute[187157]: 2025-12-03 00:33:00.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:00 compute-1 nova_compute[187157]: 2025-12-03 00:33:00.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:01 compute-1 podman[227100]: 2025-12-03 00:33:01.208323983 +0000 UTC m=+0.051140857 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 00:33:01 compute-1 podman[227101]: 2025-12-03 00:33:01.266095599 +0000 UTC m=+0.103924052 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 03 00:33:01 compute-1 nova_compute[187157]: 2025-12-03 00:33:01.576 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:33:01.764 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:33:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:33:01.765 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:33:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:33:01.765 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:33:02 compute-1 nova_compute[187157]: 2025-12-03 00:33:02.762 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:04 compute-1 nova_compute[187157]: 2025-12-03 00:33:04.205 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:05 compute-1 podman[197537]: time="2025-12-03T00:33:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:33:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:33:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:33:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:33:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2615 "" "Go-http-client/1.1"
Dec 03 00:33:06 compute-1 nova_compute[187157]: 2025-12-03 00:33:06.593 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:06 compute-1 nova_compute[187157]: 2025-12-03 00:33:06.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:07 compute-1 nova_compute[187157]: 2025-12-03 00:33:07.215 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:33:07 compute-1 nova_compute[187157]: 2025-12-03 00:33:07.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:33:07 compute-1 nova_compute[187157]: 2025-12-03 00:33:07.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:33:07 compute-1 nova_compute[187157]: 2025-12-03 00:33:07.216 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:33:07 compute-1 nova_compute[187157]: 2025-12-03 00:33:07.416 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:33:07 compute-1 nova_compute[187157]: 2025-12-03 00:33:07.418 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:33:07 compute-1 nova_compute[187157]: 2025-12-03 00:33:07.447 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:33:07 compute-1 nova_compute[187157]: 2025-12-03 00:33:07.448 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5729MB free_disk=73.16085052490234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:33:07 compute-1 nova_compute[187157]: 2025-12-03 00:33:07.448 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:33:07 compute-1 nova_compute[187157]: 2025-12-03 00:33:07.448 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:33:07 compute-1 nova_compute[187157]: 2025-12-03 00:33:07.806 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:08 compute-1 nova_compute[187157]: 2025-12-03 00:33:08.488 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:33:08 compute-1 nova_compute[187157]: 2025-12-03 00:33:08.489 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:33:07 up  1:40,  0 user,  load average: 0.21, 0.32, 0.29\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:33:08 compute-1 nova_compute[187157]: 2025-12-03 00:33:08.513 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:33:08 compute-1 sshd-session[227146]: Invalid user sol from 45.148.10.240 port 40066
Dec 03 00:33:08 compute-1 sshd-session[227146]: Connection closed by invalid user sol 45.148.10.240 port 40066 [preauth]
Dec 03 00:33:09 compute-1 nova_compute[187157]: 2025-12-03 00:33:09.021 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:33:09 compute-1 nova_compute[187157]: 2025-12-03 00:33:09.542 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:33:09 compute-1 nova_compute[187157]: 2025-12-03 00:33:09.543 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:33:11 compute-1 nova_compute[187157]: 2025-12-03 00:33:11.640 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:12 compute-1 nova_compute[187157]: 2025-12-03 00:33:12.808 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:14 compute-1 nova_compute[187157]: 2025-12-03 00:33:14.539 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:14 compute-1 nova_compute[187157]: 2025-12-03 00:33:14.539 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:14 compute-1 nova_compute[187157]: 2025-12-03 00:33:14.540 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:14 compute-1 nova_compute[187157]: 2025-12-03 00:33:14.540 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:33:16 compute-1 nova_compute[187157]: 2025-12-03 00:33:16.642 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:17 compute-1 nova_compute[187157]: 2025-12-03 00:33:17.809 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:19 compute-1 podman[227148]: 2025-12-03 00:33:19.216260257 +0000 UTC m=+0.058506135 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 03 00:33:19 compute-1 openstack_network_exporter[199685]: ERROR   00:33:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:33:19 compute-1 openstack_network_exporter[199685]: ERROR   00:33:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:33:19 compute-1 openstack_network_exporter[199685]: ERROR   00:33:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:33:19 compute-1 openstack_network_exporter[199685]: ERROR   00:33:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:33:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:33:19 compute-1 openstack_network_exporter[199685]: ERROR   00:33:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:33:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:33:19 compute-1 nova_compute[187157]: 2025-12-03 00:33:19.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:19 compute-1 nova_compute[187157]: 2025-12-03 00:33:19.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:19 compute-1 nova_compute[187157]: 2025-12-03 00:33:19.700 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 03 00:33:21 compute-1 nova_compute[187157]: 2025-12-03 00:33:21.203 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:21 compute-1 podman[227171]: 2025-12-03 00:33:21.211771739 +0000 UTC m=+0.057822278 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 03 00:33:21 compute-1 nova_compute[187157]: 2025-12-03 00:33:21.675 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:22 compute-1 nova_compute[187157]: 2025-12-03 00:33:22.811 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:26 compute-1 nova_compute[187157]: 2025-12-03 00:33:26.677 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:27 compute-1 nova_compute[187157]: 2025-12-03 00:33:27.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:27 compute-1 nova_compute[187157]: 2025-12-03 00:33:27.701 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 03 00:33:27 compute-1 nova_compute[187157]: 2025-12-03 00:33:27.812 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:28 compute-1 nova_compute[187157]: 2025-12-03 00:33:28.210 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 03 00:33:30 compute-1 podman[227191]: 2025-12-03 00:33:30.227289755 +0000 UTC m=+0.074208625 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:33:31 compute-1 nova_compute[187157]: 2025-12-03 00:33:31.721 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:32 compute-1 podman[227216]: 2025-12-03 00:33:32.230161926 +0000 UTC m=+0.067412811 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 03 00:33:32 compute-1 podman[227215]: 2025-12-03 00:33:32.230223298 +0000 UTC m=+0.070548668 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Dec 03 00:33:32 compute-1 nova_compute[187157]: 2025-12-03 00:33:32.815 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:35 compute-1 podman[197537]: time="2025-12-03T00:33:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:33:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:33:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:33:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:33:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2618 "" "Go-http-client/1.1"
Dec 03 00:33:36 compute-1 nova_compute[187157]: 2025-12-03 00:33:36.723 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:37 compute-1 nova_compute[187157]: 2025-12-03 00:33:37.817 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:41 compute-1 nova_compute[187157]: 2025-12-03 00:33:41.725 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:42 compute-1 nova_compute[187157]: 2025-12-03 00:33:42.818 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:46 compute-1 nova_compute[187157]: 2025-12-03 00:33:46.727 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:47 compute-1 nova_compute[187157]: 2025-12-03 00:33:47.866 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:47 compute-1 nova_compute[187157]: 2025-12-03 00:33:47.988 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:49 compute-1 openstack_network_exporter[199685]: ERROR   00:33:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:33:49 compute-1 openstack_network_exporter[199685]: ERROR   00:33:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:33:49 compute-1 openstack_network_exporter[199685]: ERROR   00:33:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:33:49 compute-1 openstack_network_exporter[199685]: ERROR   00:33:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:33:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:33:49 compute-1 openstack_network_exporter[199685]: ERROR   00:33:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:33:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:33:50 compute-1 podman[227259]: 2025-12-03 00:33:50.254368073 +0000 UTC m=+0.092455092 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 03 00:33:51 compute-1 nova_compute[187157]: 2025-12-03 00:33:51.771 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:52 compute-1 podman[227281]: 2025-12-03 00:33:52.261008316 +0000 UTC m=+0.088626000 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:33:52 compute-1 nova_compute[187157]: 2025-12-03 00:33:52.907 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:53 compute-1 sshd-session[227292]: Received disconnect from 217.170.199.90 port 35872:11:  [preauth]
Dec 03 00:33:53 compute-1 sshd-session[227292]: Disconnected from authenticating user root 217.170.199.90 port 35872 [preauth]
Dec 03 00:33:56 compute-1 nova_compute[187157]: 2025-12-03 00:33:56.773 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:57 compute-1 nova_compute[187157]: 2025-12-03 00:33:57.909 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:59 compute-1 nova_compute[187157]: 2025-12-03 00:33:59.211 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:00 compute-1 nova_compute[187157]: 2025-12-03 00:34:00.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:01 compute-1 podman[227303]: 2025-12-03 00:34:01.216233601 +0000 UTC m=+0.059989526 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:34:01.766 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:34:01.767 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:34:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:34:01.767 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:34:01 compute-1 nova_compute[187157]: 2025-12-03 00:34:01.778 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:02 compute-1 nova_compute[187157]: 2025-12-03 00:34:02.912 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:03 compute-1 podman[227329]: 2025-12-03 00:34:03.225535318 +0000 UTC m=+0.068536592 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 03 00:34:03 compute-1 podman[227330]: 2025-12-03 00:34:03.25821308 +0000 UTC m=+0.096667424 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 03 00:34:04 compute-1 nova_compute[187157]: 2025-12-03 00:34:04.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:05 compute-1 podman[197537]: time="2025-12-03T00:34:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:34:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:34:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:34:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:34:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2616 "" "Go-http-client/1.1"
Dec 03 00:34:06 compute-1 nova_compute[187157]: 2025-12-03 00:34:06.778 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:07 compute-1 nova_compute[187157]: 2025-12-03 00:34:07.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:07 compute-1 nova_compute[187157]: 2025-12-03 00:34:07.949 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:08 compute-1 nova_compute[187157]: 2025-12-03 00:34:08.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:34:08 compute-1 nova_compute[187157]: 2025-12-03 00:34:08.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:34:08 compute-1 nova_compute[187157]: 2025-12-03 00:34:08.217 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:34:08 compute-1 nova_compute[187157]: 2025-12-03 00:34:08.217 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:34:08 compute-1 nova_compute[187157]: 2025-12-03 00:34:08.378 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:34:08 compute-1 nova_compute[187157]: 2025-12-03 00:34:08.379 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:34:08 compute-1 nova_compute[187157]: 2025-12-03 00:34:08.398 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:34:08 compute-1 nova_compute[187157]: 2025-12-03 00:34:08.399 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5728MB free_disk=73.16084671020508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:34:08 compute-1 nova_compute[187157]: 2025-12-03 00:34:08.399 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:34:08 compute-1 nova_compute[187157]: 2025-12-03 00:34:08.399 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:34:09 compute-1 nova_compute[187157]: 2025-12-03 00:34:09.461 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:34:09 compute-1 nova_compute[187157]: 2025-12-03 00:34:09.461 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:34:08 up  1:41,  0 user,  load average: 0.08, 0.26, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:34:09 compute-1 nova_compute[187157]: 2025-12-03 00:34:09.587 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing inventories for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 03 00:34:09 compute-1 nova_compute[187157]: 2025-12-03 00:34:09.676 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Updating ProviderTree inventory for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 03 00:34:09 compute-1 nova_compute[187157]: 2025-12-03 00:34:09.676 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Updating inventory in ProviderTree for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 03 00:34:09 compute-1 nova_compute[187157]: 2025-12-03 00:34:09.690 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing aggregate associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 03 00:34:09 compute-1 nova_compute[187157]: 2025-12-03 00:34:09.715 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing trait associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ARCH_X86_64,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 03 00:34:09 compute-1 nova_compute[187157]: 2025-12-03 00:34:09.732 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:34:10 compute-1 nova_compute[187157]: 2025-12-03 00:34:10.239 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:34:10 compute-1 nova_compute[187157]: 2025-12-03 00:34:10.747 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:34:10 compute-1 nova_compute[187157]: 2025-12-03 00:34:10.747 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.348s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:34:11 compute-1 nova_compute[187157]: 2025-12-03 00:34:11.781 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:12 compute-1 nova_compute[187157]: 2025-12-03 00:34:12.951 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:14 compute-1 nova_compute[187157]: 2025-12-03 00:34:14.742 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:14 compute-1 nova_compute[187157]: 2025-12-03 00:34:14.742 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:14 compute-1 nova_compute[187157]: 2025-12-03 00:34:14.743 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:14 compute-1 nova_compute[187157]: 2025-12-03 00:34:14.743 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:34:16 compute-1 nova_compute[187157]: 2025-12-03 00:34:16.824 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:17 compute-1 nova_compute[187157]: 2025-12-03 00:34:17.998 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:19 compute-1 openstack_network_exporter[199685]: ERROR   00:34:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:34:19 compute-1 openstack_network_exporter[199685]: ERROR   00:34:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:34:19 compute-1 openstack_network_exporter[199685]: ERROR   00:34:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:34:19 compute-1 openstack_network_exporter[199685]: ERROR   00:34:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:34:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:34:19 compute-1 openstack_network_exporter[199685]: ERROR   00:34:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:34:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:34:20 compute-1 nova_compute[187157]: 2025-12-03 00:34:20.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:21 compute-1 podman[227373]: 2025-12-03 00:34:21.22523323 +0000 UTC m=+0.065578491 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc.)
Dec 03 00:34:21 compute-1 nova_compute[187157]: 2025-12-03 00:34:21.903 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:23 compute-1 nova_compute[187157]: 2025-12-03 00:34:23.059 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:23 compute-1 podman[227394]: 2025-12-03 00:34:23.216650574 +0000 UTC m=+0.056922531 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 03 00:34:26 compute-1 nova_compute[187157]: 2025-12-03 00:34:26.905 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:28 compute-1 nova_compute[187157]: 2025-12-03 00:34:28.062 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:31 compute-1 nova_compute[187157]: 2025-12-03 00:34:31.945 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:32 compute-1 podman[227415]: 2025-12-03 00:34:32.196424753 +0000 UTC m=+0.043388853 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:34:33 compute-1 nova_compute[187157]: 2025-12-03 00:34:33.064 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:34 compute-1 podman[227440]: 2025-12-03 00:34:34.206828777 +0000 UTC m=+0.048946346 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:34:34 compute-1 podman[227441]: 2025-12-03 00:34:34.289195914 +0000 UTC m=+0.123085864 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 03 00:34:35 compute-1 podman[197537]: time="2025-12-03T00:34:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:34:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:34:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:34:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:34:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2613 "" "Go-http-client/1.1"
Dec 03 00:34:36 compute-1 nova_compute[187157]: 2025-12-03 00:34:36.946 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:38 compute-1 nova_compute[187157]: 2025-12-03 00:34:38.066 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:41 compute-1 nova_compute[187157]: 2025-12-03 00:34:41.948 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:43 compute-1 nova_compute[187157]: 2025-12-03 00:34:43.097 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:46 compute-1 nova_compute[187157]: 2025-12-03 00:34:46.994 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:48 compute-1 nova_compute[187157]: 2025-12-03 00:34:48.101 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:49 compute-1 openstack_network_exporter[199685]: ERROR   00:34:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:34:49 compute-1 openstack_network_exporter[199685]: ERROR   00:34:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:34:49 compute-1 openstack_network_exporter[199685]: ERROR   00:34:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:34:49 compute-1 openstack_network_exporter[199685]: ERROR   00:34:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:34:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:34:49 compute-1 openstack_network_exporter[199685]: ERROR   00:34:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:34:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:34:52 compute-1 nova_compute[187157]: 2025-12-03 00:34:52.031 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:52 compute-1 podman[227480]: 2025-12-03 00:34:52.230180544 +0000 UTC m=+0.063970001 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, version=9.6, config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Dec 03 00:34:53 compute-1 nova_compute[187157]: 2025-12-03 00:34:53.154 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:54 compute-1 podman[227502]: 2025-12-03 00:34:54.23239693 +0000 UTC m=+0.075712726 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 03 00:34:57 compute-1 nova_compute[187157]: 2025-12-03 00:34:57.032 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:58 compute-1 nova_compute[187157]: 2025-12-03 00:34:58.156 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:59 compute-1 nova_compute[187157]: 2025-12-03 00:34:59.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:01 compute-1 nova_compute[187157]: 2025-12-03 00:35:01.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:35:01.767 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:35:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:35:01.768 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:35:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:35:01.768 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:35:02 compute-1 nova_compute[187157]: 2025-12-03 00:35:02.084 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:03 compute-1 nova_compute[187157]: 2025-12-03 00:35:03.159 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:03 compute-1 podman[227521]: 2025-12-03 00:35:03.20266342 +0000 UTC m=+0.046383717 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:35:05 compute-1 podman[227545]: 2025-12-03 00:35:05.206874094 +0000 UTC m=+0.053458396 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 03 00:35:05 compute-1 podman[227546]: 2025-12-03 00:35:05.242505757 +0000 UTC m=+0.085999004 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Dec 03 00:35:05 compute-1 podman[197537]: time="2025-12-03T00:35:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:35:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:35:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:35:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:35:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2615 "" "Go-http-client/1.1"
Dec 03 00:35:06 compute-1 nova_compute[187157]: 2025-12-03 00:35:06.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:07 compute-1 nova_compute[187157]: 2025-12-03 00:35:07.086 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:07 compute-1 nova_compute[187157]: 2025-12-03 00:35:07.742 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:35:07.742 104348 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:35:07 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:35:07.743 104348 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:35:08 compute-1 nova_compute[187157]: 2025-12-03 00:35:08.161 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:09 compute-1 sshd-session[227589]: Invalid user solana from 45.148.10.240 port 43904
Dec 03 00:35:09 compute-1 sshd-session[227589]: Connection closed by invalid user solana 45.148.10.240 port 43904 [preauth]
Dec 03 00:35:09 compute-1 nova_compute[187157]: 2025-12-03 00:35:09.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:09 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:35:09.744 104348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e895a64d-10b7-4a6e-a7ff-0745f1562623, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:35:10 compute-1 nova_compute[187157]: 2025-12-03 00:35:10.214 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:35:10 compute-1 nova_compute[187157]: 2025-12-03 00:35:10.215 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:35:10 compute-1 nova_compute[187157]: 2025-12-03 00:35:10.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:35:10 compute-1 nova_compute[187157]: 2025-12-03 00:35:10.216 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:35:10 compute-1 nova_compute[187157]: 2025-12-03 00:35:10.448 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:35:10 compute-1 nova_compute[187157]: 2025-12-03 00:35:10.450 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:35:10 compute-1 nova_compute[187157]: 2025-12-03 00:35:10.494 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:35:10 compute-1 nova_compute[187157]: 2025-12-03 00:35:10.495 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5734MB free_disk=73.16084671020508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:35:10 compute-1 nova_compute[187157]: 2025-12-03 00:35:10.495 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:35:10 compute-1 nova_compute[187157]: 2025-12-03 00:35:10.496 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:35:11 compute-1 nova_compute[187157]: 2025-12-03 00:35:11.538 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:35:11 compute-1 nova_compute[187157]: 2025-12-03 00:35:11.539 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:35:10 up  1:42,  0 user,  load average: 0.03, 0.21, 0.25\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:35:11 compute-1 nova_compute[187157]: 2025-12-03 00:35:11.556 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:35:12 compute-1 nova_compute[187157]: 2025-12-03 00:35:12.063 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:35:12 compute-1 nova_compute[187157]: 2025-12-03 00:35:12.122 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:12 compute-1 nova_compute[187157]: 2025-12-03 00:35:12.577 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:35:12 compute-1 nova_compute[187157]: 2025-12-03 00:35:12.578 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.082s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:35:13 compute-1 nova_compute[187157]: 2025-12-03 00:35:13.204 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:16 compute-1 nova_compute[187157]: 2025-12-03 00:35:16.573 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:16 compute-1 nova_compute[187157]: 2025-12-03 00:35:16.574 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:16 compute-1 nova_compute[187157]: 2025-12-03 00:35:16.574 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:35:16 compute-1 nova_compute[187157]: 2025-12-03 00:35:16.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:17 compute-1 nova_compute[187157]: 2025-12-03 00:35:17.124 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:17 compute-1 sshd-session[227592]: Connection closed by authenticating user root 185.156.73.233 port 40948 [preauth]
Dec 03 00:35:18 compute-1 nova_compute[187157]: 2025-12-03 00:35:18.206 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:19 compute-1 openstack_network_exporter[199685]: ERROR   00:35:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:35:19 compute-1 openstack_network_exporter[199685]: ERROR   00:35:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:35:19 compute-1 openstack_network_exporter[199685]: ERROR   00:35:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:35:19 compute-1 openstack_network_exporter[199685]: ERROR   00:35:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:35:19 compute-1 openstack_network_exporter[199685]: ERROR   00:35:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:35:20 compute-1 nova_compute[187157]: 2025-12-03 00:35:20.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:22 compute-1 nova_compute[187157]: 2025-12-03 00:35:22.166 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:23 compute-1 podman[227594]: 2025-12-03 00:35:23.233229385 +0000 UTC m=+0.074987729 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, config_id=edpm, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., distribution-scope=public)
Dec 03 00:35:23 compute-1 nova_compute[187157]: 2025-12-03 00:35:23.253 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:25 compute-1 podman[227617]: 2025-12-03 00:35:25.216062241 +0000 UTC m=+0.063818239 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:35:25 compute-1 nova_compute[187157]: 2025-12-03 00:35:25.696 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:27 compute-1 nova_compute[187157]: 2025-12-03 00:35:27.168 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:28 compute-1 nova_compute[187157]: 2025-12-03 00:35:28.256 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:32 compute-1 nova_compute[187157]: 2025-12-03 00:35:32.220 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:33 compute-1 nova_compute[187157]: 2025-12-03 00:35:33.259 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:34 compute-1 podman[227638]: 2025-12-03 00:35:34.244508531 +0000 UTC m=+0.073202146 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:35:35 compute-1 podman[197537]: time="2025-12-03T00:35:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:35:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:35:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:35:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:35:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2619 "" "Go-http-client/1.1"
Dec 03 00:35:36 compute-1 podman[227662]: 2025-12-03 00:35:36.230183956 +0000 UTC m=+0.074522448 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 03 00:35:36 compute-1 podman[227663]: 2025-12-03 00:35:36.274685305 +0000 UTC m=+0.103448789 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 03 00:35:37 compute-1 nova_compute[187157]: 2025-12-03 00:35:37.221 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:38 compute-1 nova_compute[187157]: 2025-12-03 00:35:38.261 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:42 compute-1 nova_compute[187157]: 2025-12-03 00:35:42.222 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:43 compute-1 nova_compute[187157]: 2025-12-03 00:35:43.265 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:47 compute-1 nova_compute[187157]: 2025-12-03 00:35:47.223 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:48 compute-1 nova_compute[187157]: 2025-12-03 00:35:48.267 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:48 compute-1 sshd-session[227706]: Received disconnect from 193.46.255.244 port 42820:11:  [preauth]
Dec 03 00:35:48 compute-1 sshd-session[227706]: Disconnected from authenticating user root 193.46.255.244 port 42820 [preauth]
Dec 03 00:35:49 compute-1 openstack_network_exporter[199685]: ERROR   00:35:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:35:49 compute-1 openstack_network_exporter[199685]: ERROR   00:35:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:35:49 compute-1 openstack_network_exporter[199685]: ERROR   00:35:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:35:49 compute-1 openstack_network_exporter[199685]: ERROR   00:35:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:35:49 compute-1 openstack_network_exporter[199685]: ERROR   00:35:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:35:52 compute-1 nova_compute[187157]: 2025-12-03 00:35:52.224 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:53 compute-1 nova_compute[187157]: 2025-12-03 00:35:53.270 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:54 compute-1 podman[227708]: 2025-12-03 00:35:54.216566066 +0000 UTC m=+0.060675691 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, maintainer=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Dec 03 00:35:56 compute-1 podman[227729]: 2025-12-03 00:35:56.228884518 +0000 UTC m=+0.059559025 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:35:57 compute-1 nova_compute[187157]: 2025-12-03 00:35:57.394 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:58 compute-1 nova_compute[187157]: 2025-12-03 00:35:58.273 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:59 compute-1 nova_compute[187157]: 2025-12-03 00:35:59.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:36:01 compute-1 nova_compute[187157]: 2025-12-03 00:36:01.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:36:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:36:01.769 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:36:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:36:01.769 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:36:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:36:01.770 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:36:02 compute-1 nova_compute[187157]: 2025-12-03 00:36:02.396 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:03 compute-1 nova_compute[187157]: 2025-12-03 00:36:03.275 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:05 compute-1 podman[227750]: 2025-12-03 00:36:05.230591328 +0000 UTC m=+0.073868471 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:36:05 compute-1 podman[197537]: time="2025-12-03T00:36:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:36:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:36:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:36:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:36:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2617 "" "Go-http-client/1.1"
Dec 03 00:36:07 compute-1 podman[227774]: 2025-12-03 00:36:07.208384201 +0000 UTC m=+0.054737767 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 03 00:36:07 compute-1 podman[227775]: 2025-12-03 00:36:07.320983321 +0000 UTC m=+0.158387840 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 03 00:36:07 compute-1 nova_compute[187157]: 2025-12-03 00:36:07.398 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:07 compute-1 nova_compute[187157]: 2025-12-03 00:36:07.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:36:08 compute-1 nova_compute[187157]: 2025-12-03 00:36:08.278 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:09 compute-1 nova_compute[187157]: 2025-12-03 00:36:09.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:36:10 compute-1 nova_compute[187157]: 2025-12-03 00:36:10.271 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:36:10 compute-1 nova_compute[187157]: 2025-12-03 00:36:10.271 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:36:10 compute-1 nova_compute[187157]: 2025-12-03 00:36:10.272 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:36:10 compute-1 nova_compute[187157]: 2025-12-03 00:36:10.272 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:36:10 compute-1 nova_compute[187157]: 2025-12-03 00:36:10.445 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:36:10 compute-1 nova_compute[187157]: 2025-12-03 00:36:10.446 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:36:10 compute-1 nova_compute[187157]: 2025-12-03 00:36:10.461 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:36:10 compute-1 nova_compute[187157]: 2025-12-03 00:36:10.461 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5761MB free_disk=73.16084671020508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:36:10 compute-1 nova_compute[187157]: 2025-12-03 00:36:10.462 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:36:10 compute-1 nova_compute[187157]: 2025-12-03 00:36:10.462 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:36:11 compute-1 nova_compute[187157]: 2025-12-03 00:36:11.535 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:36:11 compute-1 nova_compute[187157]: 2025-12-03 00:36:11.535 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:36:10 up  1:43,  0 user,  load average: 0.01, 0.17, 0.23\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:36:11 compute-1 nova_compute[187157]: 2025-12-03 00:36:11.575 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:36:12 compute-1 nova_compute[187157]: 2025-12-03 00:36:12.083 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:36:12 compute-1 nova_compute[187157]: 2025-12-03 00:36:12.401 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:12 compute-1 nova_compute[187157]: 2025-12-03 00:36:12.599 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:36:12 compute-1 nova_compute[187157]: 2025-12-03 00:36:12.599 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.137s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:36:13 compute-1 nova_compute[187157]: 2025-12-03 00:36:13.280 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:17 compute-1 nova_compute[187157]: 2025-12-03 00:36:17.401 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:17 compute-1 nova_compute[187157]: 2025-12-03 00:36:17.600 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:36:17 compute-1 nova_compute[187157]: 2025-12-03 00:36:17.600 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:36:17 compute-1 nova_compute[187157]: 2025-12-03 00:36:17.600 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:36:17 compute-1 nova_compute[187157]: 2025-12-03 00:36:17.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:36:18 compute-1 nova_compute[187157]: 2025-12-03 00:36:18.282 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:19 compute-1 openstack_network_exporter[199685]: ERROR   00:36:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:36:19 compute-1 openstack_network_exporter[199685]: ERROR   00:36:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:36:19 compute-1 openstack_network_exporter[199685]: ERROR   00:36:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:36:19 compute-1 openstack_network_exporter[199685]: ERROR   00:36:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:36:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:36:19 compute-1 openstack_network_exporter[199685]: ERROR   00:36:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:36:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:36:21 compute-1 nova_compute[187157]: 2025-12-03 00:36:21.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:36:22 compute-1 nova_compute[187157]: 2025-12-03 00:36:22.404 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:23 compute-1 nova_compute[187157]: 2025-12-03 00:36:23.284 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:25 compute-1 podman[227820]: 2025-12-03 00:36:25.205659006 +0000 UTC m=+0.051017497 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:36:27 compute-1 podman[227841]: 2025-12-03 00:36:27.209209544 +0000 UTC m=+0.052235407 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 03 00:36:27 compute-1 nova_compute[187157]: 2025-12-03 00:36:27.406 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:28 compute-1 nova_compute[187157]: 2025-12-03 00:36:28.304 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:32 compute-1 nova_compute[187157]: 2025-12-03 00:36:32.407 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:33 compute-1 nova_compute[187157]: 2025-12-03 00:36:33.306 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:35 compute-1 podman[197537]: time="2025-12-03T00:36:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:36:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:36:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:36:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:36:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2619 "" "Go-http-client/1.1"
Dec 03 00:36:35 compute-1 podman[227862]: 2025-12-03 00:36:35.764393911 +0000 UTC m=+0.081559408 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:36:37 compute-1 nova_compute[187157]: 2025-12-03 00:36:37.409 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:38 compute-1 podman[227886]: 2025-12-03 00:36:38.230517943 +0000 UTC m=+0.073170174 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 03 00:36:38 compute-1 podman[227887]: 2025-12-03 00:36:38.232063091 +0000 UTC m=+0.070398548 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 03 00:36:38 compute-1 nova_compute[187157]: 2025-12-03 00:36:38.307 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:42 compute-1 nova_compute[187157]: 2025-12-03 00:36:42.411 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:43 compute-1 nova_compute[187157]: 2025-12-03 00:36:43.310 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:47 compute-1 nova_compute[187157]: 2025-12-03 00:36:47.414 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:48 compute-1 nova_compute[187157]: 2025-12-03 00:36:48.312 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:49 compute-1 openstack_network_exporter[199685]: ERROR   00:36:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:36:49 compute-1 openstack_network_exporter[199685]: ERROR   00:36:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:36:49 compute-1 openstack_network_exporter[199685]: ERROR   00:36:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:36:49 compute-1 openstack_network_exporter[199685]: ERROR   00:36:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:36:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:36:49 compute-1 openstack_network_exporter[199685]: ERROR   00:36:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:36:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:36:52 compute-1 nova_compute[187157]: 2025-12-03 00:36:52.415 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:53 compute-1 nova_compute[187157]: 2025-12-03 00:36:53.314 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:56 compute-1 podman[227932]: 2025-12-03 00:36:56.207599498 +0000 UTC m=+0.055162457 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Dec 03 00:36:57 compute-1 nova_compute[187157]: 2025-12-03 00:36:57.417 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:58 compute-1 podman[227953]: 2025-12-03 00:36:58.209501296 +0000 UTC m=+0.052830412 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Dec 03 00:36:58 compute-1 nova_compute[187157]: 2025-12-03 00:36:58.316 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:01 compute-1 nova_compute[187157]: 2025-12-03 00:37:01.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:37:01.770 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:37:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:37:01.771 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:37:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:37:01.771 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:37:02 compute-1 nova_compute[187157]: 2025-12-03 00:37:02.418 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:02 compute-1 nova_compute[187157]: 2025-12-03 00:37:02.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:03 compute-1 nova_compute[187157]: 2025-12-03 00:37:03.318 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:05 compute-1 podman[197537]: time="2025-12-03T00:37:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:37:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:37:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:37:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:37:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2618 "" "Go-http-client/1.1"
Dec 03 00:37:06 compute-1 podman[227976]: 2025-12-03 00:37:06.198421667 +0000 UTC m=+0.048032035 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:37:07 compute-1 nova_compute[187157]: 2025-12-03 00:37:07.419 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:07 compute-1 nova_compute[187157]: 2025-12-03 00:37:07.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:08 compute-1 nova_compute[187157]: 2025-12-03 00:37:08.320 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:09 compute-1 podman[228000]: 2025-12-03 00:37:09.215063483 +0000 UTC m=+0.060361704 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4)
Dec 03 00:37:09 compute-1 podman[228001]: 2025-12-03 00:37:09.268192671 +0000 UTC m=+0.103634794 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 03 00:37:10 compute-1 sshd-session[228043]: Invalid user solana from 45.148.10.240 port 52766
Dec 03 00:37:10 compute-1 sshd-session[228043]: Connection closed by invalid user solana 45.148.10.240 port 52766 [preauth]
Dec 03 00:37:11 compute-1 nova_compute[187157]: 2025-12-03 00:37:11.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:12 compute-1 nova_compute[187157]: 2025-12-03 00:37:12.221 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:37:12 compute-1 nova_compute[187157]: 2025-12-03 00:37:12.221 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:37:12 compute-1 nova_compute[187157]: 2025-12-03 00:37:12.221 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:37:12 compute-1 nova_compute[187157]: 2025-12-03 00:37:12.222 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:37:12 compute-1 nova_compute[187157]: 2025-12-03 00:37:12.349 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:37:12 compute-1 nova_compute[187157]: 2025-12-03 00:37:12.350 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:37:12 compute-1 nova_compute[187157]: 2025-12-03 00:37:12.368 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:37:12 compute-1 nova_compute[187157]: 2025-12-03 00:37:12.368 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5773MB free_disk=73.16084671020508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:37:12 compute-1 nova_compute[187157]: 2025-12-03 00:37:12.369 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:37:12 compute-1 nova_compute[187157]: 2025-12-03 00:37:12.369 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:37:12 compute-1 nova_compute[187157]: 2025-12-03 00:37:12.420 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:13 compute-1 nova_compute[187157]: 2025-12-03 00:37:13.322 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:13 compute-1 nova_compute[187157]: 2025-12-03 00:37:13.491 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:37:13 compute-1 nova_compute[187157]: 2025-12-03 00:37:13.491 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:37:12 up  1:44,  0 user,  load average: 0.00, 0.13, 0.21\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:37:13 compute-1 nova_compute[187157]: 2025-12-03 00:37:13.512 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:37:14 compute-1 nova_compute[187157]: 2025-12-03 00:37:14.019 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:37:14 compute-1 nova_compute[187157]: 2025-12-03 00:37:14.528 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:37:14 compute-1 nova_compute[187157]: 2025-12-03 00:37:14.528 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.159s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:37:17 compute-1 nova_compute[187157]: 2025-12-03 00:37:17.423 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:18 compute-1 nova_compute[187157]: 2025-12-03 00:37:18.357 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:18 compute-1 nova_compute[187157]: 2025-12-03 00:37:18.524 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:18 compute-1 nova_compute[187157]: 2025-12-03 00:37:18.525 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:18 compute-1 nova_compute[187157]: 2025-12-03 00:37:18.525 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:18 compute-1 nova_compute[187157]: 2025-12-03 00:37:18.525 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:37:19 compute-1 openstack_network_exporter[199685]: ERROR   00:37:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:37:19 compute-1 openstack_network_exporter[199685]: ERROR   00:37:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:37:19 compute-1 openstack_network_exporter[199685]: ERROR   00:37:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:37:19 compute-1 openstack_network_exporter[199685]: ERROR   00:37:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:37:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:37:19 compute-1 openstack_network_exporter[199685]: ERROR   00:37:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:37:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:37:22 compute-1 nova_compute[187157]: 2025-12-03 00:37:22.424 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:22 compute-1 nova_compute[187157]: 2025-12-03 00:37:22.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:23 compute-1 nova_compute[187157]: 2025-12-03 00:37:23.359 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:25 compute-1 nova_compute[187157]: 2025-12-03 00:37:25.696 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:27 compute-1 podman[228046]: 2025-12-03 00:37:27.233233653 +0000 UTC m=+0.070657444 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Dec 03 00:37:27 compute-1 nova_compute[187157]: 2025-12-03 00:37:27.425 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:28 compute-1 nova_compute[187157]: 2025-12-03 00:37:28.418 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:29 compute-1 podman[228069]: 2025-12-03 00:37:29.203216677 +0000 UTC m=+0.051041738 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 03 00:37:32 compute-1 nova_compute[187157]: 2025-12-03 00:37:32.427 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:33 compute-1 nova_compute[187157]: 2025-12-03 00:37:33.466 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:35 compute-1 podman[197537]: time="2025-12-03T00:37:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:37:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:37:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:37:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:37:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2614 "" "Go-http-client/1.1"
Dec 03 00:37:37 compute-1 podman[228090]: 2025-12-03 00:37:37.204445376 +0000 UTC m=+0.050509876 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:37:37 compute-1 nova_compute[187157]: 2025-12-03 00:37:37.429 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:38 compute-1 nova_compute[187157]: 2025-12-03 00:37:38.523 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:40 compute-1 podman[228114]: 2025-12-03 00:37:40.216416899 +0000 UTC m=+0.060628730 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 03 00:37:40 compute-1 podman[228115]: 2025-12-03 00:37:40.260449336 +0000 UTC m=+0.100343892 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 03 00:37:42 compute-1 nova_compute[187157]: 2025-12-03 00:37:42.431 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:43 compute-1 nova_compute[187157]: 2025-12-03 00:37:43.525 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:47 compute-1 nova_compute[187157]: 2025-12-03 00:37:47.432 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:48 compute-1 nova_compute[187157]: 2025-12-03 00:37:48.527 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:49 compute-1 openstack_network_exporter[199685]: ERROR   00:37:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:37:49 compute-1 openstack_network_exporter[199685]: ERROR   00:37:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:37:49 compute-1 openstack_network_exporter[199685]: ERROR   00:37:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:37:49 compute-1 openstack_network_exporter[199685]: ERROR   00:37:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:37:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:37:49 compute-1 openstack_network_exporter[199685]: ERROR   00:37:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:37:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:37:52 compute-1 nova_compute[187157]: 2025-12-03 00:37:52.481 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:53 compute-1 nova_compute[187157]: 2025-12-03 00:37:53.548 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:57 compute-1 nova_compute[187157]: 2025-12-03 00:37:57.483 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:58 compute-1 podman[228158]: 2025-12-03 00:37:58.22832658 +0000 UTC m=+0.067643590 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 03 00:37:58 compute-1 nova_compute[187157]: 2025-12-03 00:37:58.551 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:00 compute-1 podman[228181]: 2025-12-03 00:38:00.216306441 +0000 UTC m=+0.055445535 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:38:01 compute-1 nova_compute[187157]: 2025-12-03 00:38:01.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:01 compute-1 nova_compute[187157]: 2025-12-03 00:38:01.701 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:38:01.772 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:38:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:38:01.772 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:38:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:38:01.772 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:38:02 compute-1 nova_compute[187157]: 2025-12-03 00:38:02.484 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:03 compute-1 nova_compute[187157]: 2025-12-03 00:38:03.214 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:03 compute-1 nova_compute[187157]: 2025-12-03 00:38:03.555 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:05 compute-1 podman[197537]: time="2025-12-03T00:38:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:38:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:38:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:38:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:38:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2618 "" "Go-http-client/1.1"
Dec 03 00:38:07 compute-1 nova_compute[187157]: 2025-12-03 00:38:07.543 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:08 compute-1 podman[228203]: 2025-12-03 00:38:08.235917555 +0000 UTC m=+0.073535714 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:38:08 compute-1 nova_compute[187157]: 2025-12-03 00:38:08.557 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:09 compute-1 nova_compute[187157]: 2025-12-03 00:38:09.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:11 compute-1 podman[228227]: 2025-12-03 00:38:11.202848346 +0000 UTC m=+0.045444173 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 03 00:38:11 compute-1 podman[228228]: 2025-12-03 00:38:11.237735032 +0000 UTC m=+0.074989630 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:38:12 compute-1 nova_compute[187157]: 2025-12-03 00:38:12.545 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:12 compute-1 nova_compute[187157]: 2025-12-03 00:38:12.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:13 compute-1 nova_compute[187157]: 2025-12-03 00:38:13.216 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:38:13 compute-1 nova_compute[187157]: 2025-12-03 00:38:13.217 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:38:13 compute-1 nova_compute[187157]: 2025-12-03 00:38:13.217 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:38:13 compute-1 nova_compute[187157]: 2025-12-03 00:38:13.217 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:38:13 compute-1 nova_compute[187157]: 2025-12-03 00:38:13.339 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:38:13 compute-1 nova_compute[187157]: 2025-12-03 00:38:13.339 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:38:13 compute-1 nova_compute[187157]: 2025-12-03 00:38:13.356 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:38:13 compute-1 nova_compute[187157]: 2025-12-03 00:38:13.356 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5763MB free_disk=73.15480422973633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:38:13 compute-1 nova_compute[187157]: 2025-12-03 00:38:13.357 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:38:13 compute-1 nova_compute[187157]: 2025-12-03 00:38:13.357 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:38:13 compute-1 nova_compute[187157]: 2025-12-03 00:38:13.559 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:14 compute-1 nova_compute[187157]: 2025-12-03 00:38:14.398 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:38:14 compute-1 nova_compute[187157]: 2025-12-03 00:38:14.399 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:38:13 up  1:45,  0 user,  load average: 0.00, 0.11, 0.20\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:38:14 compute-1 nova_compute[187157]: 2025-12-03 00:38:14.416 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:38:14 compute-1 nova_compute[187157]: 2025-12-03 00:38:14.922 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:38:15 compute-1 nova_compute[187157]: 2025-12-03 00:38:15.434 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:38:15 compute-1 nova_compute[187157]: 2025-12-03 00:38:15.434 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.078s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:38:17 compute-1 nova_compute[187157]: 2025-12-03 00:38:17.592 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:18 compute-1 nova_compute[187157]: 2025-12-03 00:38:18.563 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:19 compute-1 openstack_network_exporter[199685]: ERROR   00:38:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:38:19 compute-1 openstack_network_exporter[199685]: ERROR   00:38:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:38:19 compute-1 openstack_network_exporter[199685]: ERROR   00:38:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:38:19 compute-1 openstack_network_exporter[199685]: ERROR   00:38:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:38:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:38:19 compute-1 openstack_network_exporter[199685]: ERROR   00:38:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:38:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:38:19 compute-1 nova_compute[187157]: 2025-12-03 00:38:19.430 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:19 compute-1 nova_compute[187157]: 2025-12-03 00:38:19.431 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:19 compute-1 nova_compute[187157]: 2025-12-03 00:38:19.431 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:38:19 compute-1 nova_compute[187157]: 2025-12-03 00:38:19.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:22 compute-1 nova_compute[187157]: 2025-12-03 00:38:22.592 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:22 compute-1 nova_compute[187157]: 2025-12-03 00:38:22.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:23 compute-1 nova_compute[187157]: 2025-12-03 00:38:23.603 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:27 compute-1 nova_compute[187157]: 2025-12-03 00:38:27.594 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:28 compute-1 nova_compute[187157]: 2025-12-03 00:38:28.606 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:29 compute-1 podman[228272]: 2025-12-03 00:38:29.207369216 +0000 UTC m=+0.052420281 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6)
Dec 03 00:38:30 compute-1 nova_compute[187157]: 2025-12-03 00:38:30.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:30 compute-1 nova_compute[187157]: 2025-12-03 00:38:30.701 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 03 00:38:31 compute-1 podman[228293]: 2025-12-03 00:38:31.204513868 +0000 UTC m=+0.048946798 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 03 00:38:31 compute-1 nova_compute[187157]: 2025-12-03 00:38:31.208 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 03 00:38:31 compute-1 nova_compute[187157]: 2025-12-03 00:38:31.208 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:31 compute-1 nova_compute[187157]: 2025-12-03 00:38:31.208 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 03 00:38:32 compute-1 nova_compute[187157]: 2025-12-03 00:38:32.596 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:33 compute-1 nova_compute[187157]: 2025-12-03 00:38:33.648 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:35 compute-1 podman[197537]: time="2025-12-03T00:38:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:38:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:38:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:38:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:38:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2620 "" "Go-http-client/1.1"
Dec 03 00:38:37 compute-1 nova_compute[187157]: 2025-12-03 00:38:37.641 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:38 compute-1 nova_compute[187157]: 2025-12-03 00:38:38.650 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:39 compute-1 podman[228313]: 2025-12-03 00:38:39.214524031 +0000 UTC m=+0.055347612 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:38:42 compute-1 podman[228339]: 2025-12-03 00:38:42.199598163 +0000 UTC m=+0.047515583 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 03 00:38:42 compute-1 podman[228340]: 2025-12-03 00:38:42.307228863 +0000 UTC m=+0.150137330 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 03 00:38:42 compute-1 nova_compute[187157]: 2025-12-03 00:38:42.643 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:43 compute-1 nova_compute[187157]: 2025-12-03 00:38:43.705 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:47 compute-1 nova_compute[187157]: 2025-12-03 00:38:47.679 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:48 compute-1 nova_compute[187157]: 2025-12-03 00:38:48.745 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:49 compute-1 openstack_network_exporter[199685]: ERROR   00:38:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:38:49 compute-1 openstack_network_exporter[199685]: ERROR   00:38:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:38:49 compute-1 openstack_network_exporter[199685]: ERROR   00:38:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:38:49 compute-1 openstack_network_exporter[199685]: ERROR   00:38:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:38:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:38:49 compute-1 openstack_network_exporter[199685]: ERROR   00:38:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:38:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:38:52 compute-1 nova_compute[187157]: 2025-12-03 00:38:52.680 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:53 compute-1 nova_compute[187157]: 2025-12-03 00:38:53.777 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:57 compute-1 nova_compute[187157]: 2025-12-03 00:38:57.683 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:58 compute-1 nova_compute[187157]: 2025-12-03 00:38:58.780 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:00 compute-1 podman[228386]: 2025-12-03 00:39:00.236421906 +0000 UTC m=+0.075379887 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 03 00:39:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:39:01.773 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:39:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:39:01.773 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:39:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:39:01.773 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:39:02 compute-1 podman[228408]: 2025-12-03 00:39:02.206262928 +0000 UTC m=+0.047330259 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS)
Dec 03 00:39:02 compute-1 nova_compute[187157]: 2025-12-03 00:39:02.723 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:03 compute-1 nova_compute[187157]: 2025-12-03 00:39:03.818 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:04 compute-1 nova_compute[187157]: 2025-12-03 00:39:04.714 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:04 compute-1 nova_compute[187157]: 2025-12-03 00:39:04.715 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:05 compute-1 podman[197537]: time="2025-12-03T00:39:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:39:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:39:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:39:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:39:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2618 "" "Go-http-client/1.1"
Dec 03 00:39:07 compute-1 nova_compute[187157]: 2025-12-03 00:39:07.725 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:08 compute-1 nova_compute[187157]: 2025-12-03 00:39:08.820 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:10 compute-1 podman[228428]: 2025-12-03 00:39:10.207213039 +0000 UTC m=+0.050379642 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:39:10 compute-1 nova_compute[187157]: 2025-12-03 00:39:10.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:12 compute-1 nova_compute[187157]: 2025-12-03 00:39:12.773 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:13 compute-1 podman[228453]: 2025-12-03 00:39:13.209165831 +0000 UTC m=+0.051730255 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, 
tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 03 00:39:13 compute-1 podman[228454]: 2025-12-03 00:39:13.247255634 +0000 UTC m=+0.086919968 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller)
Dec 03 00:39:13 compute-1 nova_compute[187157]: 2025-12-03 00:39:13.860 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:13 compute-1 sshd-session[228495]: Invalid user solana from 45.148.10.240 port 48702
Dec 03 00:39:13 compute-1 sshd-session[228495]: Connection closed by invalid user solana 45.148.10.240 port 48702 [preauth]
Dec 03 00:39:14 compute-1 nova_compute[187157]: 2025-12-03 00:39:14.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:15 compute-1 nova_compute[187157]: 2025-12-03 00:39:15.224 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:39:15 compute-1 nova_compute[187157]: 2025-12-03 00:39:15.224 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:39:15 compute-1 nova_compute[187157]: 2025-12-03 00:39:15.224 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:39:15 compute-1 nova_compute[187157]: 2025-12-03 00:39:15.224 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:39:15 compute-1 nova_compute[187157]: 2025-12-03 00:39:15.350 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:39:15 compute-1 nova_compute[187157]: 2025-12-03 00:39:15.351 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:39:15 compute-1 nova_compute[187157]: 2025-12-03 00:39:15.367 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:39:15 compute-1 nova_compute[187157]: 2025-12-03 00:39:15.368 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5764MB free_disk=73.15480422973633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:39:15 compute-1 nova_compute[187157]: 2025-12-03 00:39:15.368 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:39:15 compute-1 nova_compute[187157]: 2025-12-03 00:39:15.369 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:39:16 compute-1 nova_compute[187157]: 2025-12-03 00:39:16.481 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:39:16 compute-1 nova_compute[187157]: 2025-12-03 00:39:16.481 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:39:15 up  1:46,  0 user,  load average: 0.00, 0.08, 0.18\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:39:16 compute-1 nova_compute[187157]: 2025-12-03 00:39:16.501 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing inventories for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 03 00:39:16 compute-1 nova_compute[187157]: 2025-12-03 00:39:16.566 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Updating ProviderTree inventory for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 03 00:39:16 compute-1 nova_compute[187157]: 2025-12-03 00:39:16.567 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Updating inventory in ProviderTree for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 03 00:39:16 compute-1 nova_compute[187157]: 2025-12-03 00:39:16.581 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing aggregate associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 03 00:39:16 compute-1 nova_compute[187157]: 2025-12-03 00:39:16.605 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Refreshing trait associations for resource provider a6c5ccbf-f26a-4e87-95da-56336ae0b343, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ARCH_X86_64,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 03 00:39:16 compute-1 nova_compute[187157]: 2025-12-03 00:39:16.622 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:39:17 compute-1 nova_compute[187157]: 2025-12-03 00:39:17.128 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:39:17 compute-1 nova_compute[187157]: 2025-12-03 00:39:17.637 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:39:17 compute-1 nova_compute[187157]: 2025-12-03 00:39:17.637 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.269s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:39:17 compute-1 nova_compute[187157]: 2025-12-03 00:39:17.774 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:18 compute-1 nova_compute[187157]: 2025-12-03 00:39:18.912 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:19 compute-1 openstack_network_exporter[199685]: ERROR   00:39:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:39:19 compute-1 openstack_network_exporter[199685]: ERROR   00:39:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:39:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:39:19 compute-1 openstack_network_exporter[199685]: ERROR   00:39:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:39:19 compute-1 openstack_network_exporter[199685]: ERROR   00:39:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:39:19 compute-1 openstack_network_exporter[199685]: ERROR   00:39:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:39:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:39:21 compute-1 nova_compute[187157]: 2025-12-03 00:39:21.633 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:21 compute-1 nova_compute[187157]: 2025-12-03 00:39:21.633 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:21 compute-1 nova_compute[187157]: 2025-12-03 00:39:21.633 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:21 compute-1 nova_compute[187157]: 2025-12-03 00:39:21.633 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:39:22 compute-1 nova_compute[187157]: 2025-12-03 00:39:22.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:22 compute-1 nova_compute[187157]: 2025-12-03 00:39:22.777 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:23 compute-1 nova_compute[187157]: 2025-12-03 00:39:23.961 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:27 compute-1 nova_compute[187157]: 2025-12-03 00:39:27.779 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:29 compute-1 nova_compute[187157]: 2025-12-03 00:39:29.005 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:30 compute-1 nova_compute[187157]: 2025-12-03 00:39:30.696 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:31 compute-1 podman[228498]: 2025-12-03 00:39:31.232062286 +0000 UTC m=+0.068097192 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Dec 03 00:39:32 compute-1 nova_compute[187157]: 2025-12-03 00:39:32.789 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:33 compute-1 podman[228520]: 2025-12-03 00:39:33.237215143 +0000 UTC m=+0.064890794 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=multipathd)
Dec 03 00:39:34 compute-1 nova_compute[187157]: 2025-12-03 00:39:34.049 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:35 compute-1 podman[197537]: time="2025-12-03T00:39:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:39:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:39:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:39:35 compute-1 podman[197537]: @ - - [03/Dec/2025:00:39:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2617 "" "Go-http-client/1.1"
Dec 03 00:39:37 compute-1 nova_compute[187157]: 2025-12-03 00:39:37.791 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:39 compute-1 nova_compute[187157]: 2025-12-03 00:39:39.053 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:41 compute-1 podman[228540]: 2025-12-03 00:39:41.259336989 +0000 UTC m=+0.090818982 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:39:42 compute-1 nova_compute[187157]: 2025-12-03 00:39:42.798 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:44 compute-1 nova_compute[187157]: 2025-12-03 00:39:44.106 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:44 compute-1 podman[228565]: 2025-12-03 00:39:44.239949122 +0000 UTC m=+0.071262168 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Dec 03 00:39:44 compute-1 podman[228566]: 2025-12-03 00:39:44.308250148 +0000 UTC m=+0.139913983 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 03 00:39:47 compute-1 nova_compute[187157]: 2025-12-03 00:39:47.800 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:49 compute-1 nova_compute[187157]: 2025-12-03 00:39:49.109 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:49 compute-1 openstack_network_exporter[199685]: ERROR   00:39:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:39:49 compute-1 openstack_network_exporter[199685]: ERROR   00:39:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:39:49 compute-1 openstack_network_exporter[199685]: ERROR   00:39:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:39:49 compute-1 openstack_network_exporter[199685]: ERROR   00:39:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:39:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:39:49 compute-1 openstack_network_exporter[199685]: ERROR   00:39:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:39:49 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:39:52 compute-1 nova_compute[187157]: 2025-12-03 00:39:52.803 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:54 compute-1 nova_compute[187157]: 2025-12-03 00:39:54.159 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:57 compute-1 nova_compute[187157]: 2025-12-03 00:39:57.806 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:59 compute-1 nova_compute[187157]: 2025-12-03 00:39:59.161 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:40:01.774 104348 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:40:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:40:01.774 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:40:01 compute-1 ovn_metadata_agent[104343]: 2025-12-03 00:40:01.774 104348 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:40:02 compute-1 podman[228612]: 2025-12-03 00:40:02.213922899 +0000 UTC m=+0.056568503 container health_status c34210186f5df8a3be8dcfa58dc0f7e3f10ba90c5bdb62bb3109f87345b1943f (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, config_id=edpm)
Dec 03 00:40:02 compute-1 nova_compute[187157]: 2025-12-03 00:40:02.807 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:03 compute-1 nova_compute[187157]: 2025-12-03 00:40:03.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:40:04 compute-1 nova_compute[187157]: 2025-12-03 00:40:04.163 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:04 compute-1 podman[228633]: 2025-12-03 00:40:04.231177519 +0000 UTC m=+0.066900442 container health_status 135428eb2dc605c94d7b7f8472bdc4c221a76081a04c1de3fb0ba61fc5b7d1f7 (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 03 00:40:05 compute-1 podman[197537]: time="2025-12-03T00:40:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:40:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:40:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:40:05 compute-1 podman[197537]: @ - - [03/Dec/2025:00:40:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2617 "" "Go-http-client/1.1"
Dec 03 00:40:05 compute-1 nova_compute[187157]: 2025-12-03 00:40:05.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:40:07 compute-1 nova_compute[187157]: 2025-12-03 00:40:07.810 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:09 compute-1 nova_compute[187157]: 2025-12-03 00:40:09.166 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:10 compute-1 nova_compute[187157]: 2025-12-03 00:40:10.699 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:40:12 compute-1 podman[228653]: 2025-12-03 00:40:12.25640644 +0000 UTC m=+0.086136489 container health_status ccccdb66d4c8e13af604c83473c72b08ac50e9a6f1ceb444b985a0d526a31b28 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:40:12 compute-1 nova_compute[187157]: 2025-12-03 00:40:12.811 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:14 compute-1 nova_compute[187157]: 2025-12-03 00:40:14.168 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:14 compute-1 nova_compute[187157]: 2025-12-03 00:40:14.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:40:15 compute-1 podman[228677]: 2025-12-03 00:40:15.225350531 +0000 UTC m=+0.059438483 container health_status 7b5cbaa65b5079196099f6037d0bddefd5aba2bf7d33ca26aaa8483791231633 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 03 00:40:15 compute-1 nova_compute[187157]: 2025-12-03 00:40:15.225 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:40:15 compute-1 nova_compute[187157]: 2025-12-03 00:40:15.225 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:40:15 compute-1 nova_compute[187157]: 2025-12-03 00:40:15.225 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:40:15 compute-1 nova_compute[187157]: 2025-12-03 00:40:15.226 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:40:15 compute-1 podman[228678]: 2025-12-03 00:40:15.257375376 +0000 UTC m=+0.094738577 container health_status a12625fce79561a401052ba61c6b8bd54f70d6c0ef137ee550b8f1d38a02ecda (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:40:15 compute-1 nova_compute[187157]: 2025-12-03 00:40:15.376 187161 WARNING nova.virt.libvirt.driver [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:40:15 compute-1 nova_compute[187157]: 2025-12-03 00:40:15.377 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:40:15 compute-1 nova_compute[187157]: 2025-12-03 00:40:15.415 187161 DEBUG oslo_concurrency.processutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:40:15 compute-1 nova_compute[187157]: 2025-12-03 00:40:15.416 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5747MB free_disk=73.15472412109375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:40:15 compute-1 nova_compute[187157]: 2025-12-03 00:40:15.416 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:40:15 compute-1 nova_compute[187157]: 2025-12-03 00:40:15.417 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:40:15 compute-1 sshd-session[228721]: Connection closed by authenticating user root 143.198.96.196 port 55752 [preauth]
Dec 03 00:40:16 compute-1 nova_compute[187157]: 2025-12-03 00:40:16.469 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:40:16 compute-1 nova_compute[187157]: 2025-12-03 00:40:16.470 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:40:15 up  1:47,  0 user,  load average: 0.00, 0.07, 0.17\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:40:16 compute-1 nova_compute[187157]: 2025-12-03 00:40:16.490 187161 DEBUG nova.compute.provider_tree [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed in ProviderTree for provider: a6c5ccbf-f26a-4e87-95da-56336ae0b343 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:40:17 compute-1 nova_compute[187157]: 2025-12-03 00:40:17.002 187161 DEBUG nova.scheduler.client.report [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Inventory has not changed for provider a6c5ccbf-f26a-4e87-95da-56336ae0b343 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:40:17 compute-1 nova_compute[187157]: 2025-12-03 00:40:17.515 187161 DEBUG nova.compute.resource_tracker [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:40:17 compute-1 nova_compute[187157]: 2025-12-03 00:40:17.515 187161 DEBUG oslo_concurrency.lockutils [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:40:17 compute-1 nova_compute[187157]: 2025-12-03 00:40:17.813 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:19 compute-1 sshd-session[228723]: Accepted publickey for zuul from 192.168.122.10 port 40390 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 03 00:40:19 compute-1 systemd-logind[790]: New session 42 of user zuul.
Dec 03 00:40:19 compute-1 systemd[1]: Started Session 42 of User zuul.
Dec 03 00:40:19 compute-1 sshd-session[228723]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 00:40:19 compute-1 sudo[228727]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 03 00:40:19 compute-1 sudo[228727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 00:40:19 compute-1 nova_compute[187157]: 2025-12-03 00:40:19.192 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:19 compute-1 openstack_network_exporter[199685]: ERROR   00:40:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:40:19 compute-1 openstack_network_exporter[199685]: ERROR   00:40:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:40:19 compute-1 openstack_network_exporter[199685]: ERROR   00:40:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:40:19 compute-1 openstack_network_exporter[199685]: ERROR   00:40:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:40:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:40:19 compute-1 openstack_network_exporter[199685]: ERROR   00:40:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:40:19 compute-1 openstack_network_exporter[199685]: 
Dec 03 00:40:22 compute-1 nova_compute[187157]: 2025-12-03 00:40:22.511 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:40:22 compute-1 nova_compute[187157]: 2025-12-03 00:40:22.511 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:40:22 compute-1 nova_compute[187157]: 2025-12-03 00:40:22.511 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:40:22 compute-1 nova_compute[187157]: 2025-12-03 00:40:22.511 187161 DEBUG nova.compute.manager [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:40:22 compute-1 nova_compute[187157]: 2025-12-03 00:40:22.850 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:23 compute-1 ovs-vsctl[228899]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 03 00:40:23 compute-1 nova_compute[187157]: 2025-12-03 00:40:23.700 187161 DEBUG oslo_service.periodic_task [None req-a59e8e04-1c2c-46f5-b46b-2db13495fa48 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:40:24 compute-1 nova_compute[187157]: 2025-12-03 00:40:24.196 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:24 compute-1 virtqemud[186882]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 03 00:40:24 compute-1 virtqemud[186882]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 03 00:40:24 compute-1 virtqemud[186882]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 03 00:40:25 compute-1 crontab[229312]: (root) LIST (root)
Dec 03 00:40:27 compute-1 systemd[1]: Starting Hostname Service...
Dec 03 00:40:27 compute-1 systemd[1]: Started Hostname Service.
Dec 03 00:40:27 compute-1 nova_compute[187157]: 2025-12-03 00:40:27.851 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:29 compute-1 nova_compute[187157]: 2025-12-03 00:40:29.245 187161 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
